CN115690221A - Abnormal positioning method and device for camera calibration and storage medium - Google Patents

Info

Publication number
CN115690221A
CN115690221A
Authority
CN
China
Prior art keywords
camera
image
target
parameter
difference
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110838211.2A
Other languages
Chinese (zh)
Inventor
张超 (Zhang Chao)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN202110838211.2A
Publication of CN115690221A

Landscapes

  • Studio Devices (AREA)

Abstract

The disclosure relates to an anomaly locating method and device for camera calibration, and a storage medium. The method includes: obtaining a first image captured of a target object by a first camera of a terminal; obtaining a second image captured of the target object by a second camera of a reference terminal, where the second camera is the camera in the reference terminal that corresponds to the first camera of the terminal; calculating a difference parameter between the first camera and the second camera based on matched feature pairs in the first image and the second image; and determining influencing factors of the camera calibration anomaly according to the difference parameter. This technical solution provides a data basis for determining the influencing factors of a camera calibration anomaly, which helps locate those factors accurately and in turn improves production efficiency.

Description

Anomaly locating method and device for camera calibration, and storage medium
Technical Field
The present disclosure relates to the field of camera calibration technologies, and in particular, to a method and an apparatus for locating an abnormality in camera calibration, and a storage medium.
Background
As a key component of many digital products, the camera is deeply integrated into users' daily lives. With continued technological development, performance requirements for cameras keep rising. Camera calibration is an important link in this process: the calibration result and its precision directly affect the accuracy of the imaging system's subsequent work.
In practice, interference from certain influencing factors may cause anomalies during camera calibration, and these factors then need to be identified. In the related art, such anomalous cases are generally analyzed manually. However, this approach depends on the experience of the personnel involved, and the analysis results are subjective and partly guesswork.
Disclosure of Invention
To overcome the problems in the related art, the present disclosure provides an anomaly locating method, device, and storage medium for camera calibration.
According to a first aspect of the embodiments of the present disclosure, there is provided a method for locating an abnormality in camera calibration, including:
acquiring an image captured of a target object by a first camera of a terminal, to obtain a first image;
acquiring an image captured of the target object by a second camera of a reference terminal, to obtain a second image, wherein the second camera is the camera in the reference terminal corresponding to the first camera of the terminal;
calculating a difference parameter between the first camera and the second camera based on the matched feature pairs in the first image and the second image;
and determining influencing factors of the camera calibration anomaly according to the difference parameter.
Optionally, the feature pairs are feature point pairs, each including a first feature point located in the first image and a second feature point located in the second image, and the calculating the difference parameter between the first camera and the second camera based on the matched feature pairs in the first image and the second image includes:
calculating, based on a plurality of feature point pairs, a first distance value between the first feature points in the plurality of feature point pairs; and
calculating a second distance value between the second feature points in the plurality of feature point pairs, the difference parameter including the first distance value and the second distance value.
Optionally, the determining, according to the difference parameter, influencing factors of the camera calibration anomaly includes:
when the first distance value is greater than the second distance value, determining that the influencing factors of the camera calibration anomaly include that the first camera focuses closer; and
when the first distance value is smaller than the second distance value, determining that the influencing factors of the camera calibration anomaly include that the first camera focuses farther.
Optionally, the first camera includes a first target camera and a second target camera, and the first image includes a first target image acquired by the first target camera and a second target image acquired by the second target camera; the second camera comprises a first reference camera and a second reference camera, and the second image comprises a first reference image acquired by the first reference camera and a second reference image acquired by the second reference camera; the calculating a difference parameter between the first camera and the second camera based on the matched feature pair in the first image and the second image comprises:
calculating a first rotation parameter and a first translation parameter between the first target camera and the first reference camera based on the matched feature point pairs in the first target image and the first reference image;
calculating a second rotation parameter and a second translation parameter between the second target camera and the second reference camera based on the matched feature point pairs in the second target image and the second reference image;
wherein the difference parameters include the first rotation parameter, the first translation parameter, the second rotation parameter, and the second translation parameter.
Optionally, the determining, according to the difference parameter, influencing factors of the camera calibration anomaly includes:
calculating a first difference value between the first rotation parameter and the second rotation parameter;
when the first difference value is greater than a rotation difference threshold, determining that the influencing factors of the camera calibration anomaly include rotation between the first target camera and the second target camera;
calculating a second difference value between the first translation parameter and the second translation parameter; and
when the second difference value is greater than a translation difference threshold, determining that the influencing factors of the camera calibration anomaly include translation between the first target camera and the second target camera.
Optionally, the method further comprises:
transforming the second target image based on the first rotation parameter and the first translation parameter;
and calculating a third rotation parameter and a third translation parameter between the first camera and the second camera based on the matched feature point pairs in the transformed second target image and the second reference image.
Optionally, the first camera includes a first target camera and a second target camera, and the first image includes a first target image acquired by the first target camera and a second target image acquired by the second target camera; the second camera comprises a first reference camera and a second reference camera, and the second image comprises a first reference image acquired by the first reference camera and a second reference image acquired by the second reference camera; the calculating a difference parameter between the first camera and the second camera based on the matched feature pair in the first image and the second image comprises:
calculating a first rotation parameter and a first translation parameter between the first target camera and the first reference camera based on the matched feature point pairs in the first target image and the first reference image;
transforming the second target image based on the first rotation parameter and the first translation parameter;
and calculating a third rotation parameter and a third translation parameter between the first camera and the second camera based on the matched feature point pairs in the transformed second target image and the second reference image, wherein the difference parameter comprises the third rotation parameter and the third translation parameter.
According to a second aspect of the embodiments of the present disclosure, there is provided an anomaly locating device for camera calibration, including:
a first acquisition module configured to obtain an image captured of a target object by a first camera of the terminal, to obtain a first image;
a second acquisition module configured to obtain an image captured of the target object by a second camera of the reference terminal, to obtain a second image, wherein the second camera is the camera in the reference terminal corresponding to the first camera of the terminal;
a first calculation module configured to calculate a difference parameter between the first camera and the second camera based on the matched feature pairs in the first image and the second image;
a determining module configured to determine an influencing factor of the camera calibration anomaly according to the difference parameter.
Optionally, the feature pairs are feature point pairs, each feature point pair including a first feature point located in the first image and a second feature point located in the second image, and the first calculation module includes:
a first calculation submodule configured to calculate, based on a plurality of feature point pairs, a first distance value between the first feature points in the plurality of feature point pairs;
a second calculation submodule configured to calculate a second distance value between the second feature points in the plurality of feature point pairs, the difference parameter including the first distance value and the second distance value.
Optionally, the determining module includes:
a first determination submodule configured to determine, when the first distance value is greater than the second distance value, that the influencing factors of the camera calibration anomaly include that the first camera focuses closer;
a second determination submodule configured to determine, when the first distance value is smaller than the second distance value, that the influencing factors of the camera calibration anomaly include that the first camera focuses farther.
Optionally, the first camera includes a first target camera and a second target camera, and the first image includes a first target image acquired by the first target camera and a second target image acquired by the second target camera; the second camera comprises a first reference camera and a second reference camera, and the second image comprises a first reference image acquired by the first reference camera and a second reference image acquired by the second reference camera; the first computing module, comprising:
the third calculation sub-module is configured to calculate a first rotation parameter and a first translation parameter between the first target camera and the first reference camera based on the matched feature point pairs in the first target image and the first reference image;
a fourth calculation submodule configured to calculate a second rotation parameter and a second translation parameter between the second target camera and the second reference camera based on the matched pairs of feature points in the second target image and the second reference image;
wherein the difference parameter comprises the first rotation parameter, the first translation parameter, the second rotation parameter, and the second translation parameter.
Optionally, the determining module includes:
a fifth calculation submodule configured to calculate a first difference value between the first rotation parameter and the second rotation parameter;
a third determination submodule configured to determine that, when the first difference value is greater than a rotation difference threshold, an influencing factor of a camera calibration abnormality includes rotation between the first target camera and the second target camera;
a sixth calculation submodule configured to calculate a second difference value between the first translation parameter and the second translation parameter;
a fourth determination submodule configured to determine that the influencing factor of the camera calibration anomaly includes that there is translation between the first target camera and the second target camera when the second difference value is greater than a translation difference threshold value.
Optionally, the apparatus further comprises:
a first image transformation module configured to transform the second target image based on the first rotation parameter and the first translation parameter;
and the second calculation module is configured to calculate a third rotation parameter and a third translation parameter between the first camera and the second camera based on the matched feature point pairs in the transformed second target image and the second reference image.
Optionally, the first camera includes a first target camera and a second target camera, and the first image includes a first target image acquired by the first target camera and a second target image acquired by the second target camera; the second camera comprises a first reference camera and a second reference camera, and the second image comprises a first reference image acquired by the first reference camera and a second reference image acquired by the second reference camera; the first computing module, comprising:
a seventh calculation submodule configured to calculate a first rotation parameter and a first translation parameter between the first target camera and the first reference camera based on the matched pairs of feature points in the first target image and the first reference image;
an image transformation submodule configured to transform the second target image based on the first rotation parameter and the first translation parameter;
an eighth calculation submodule configured to calculate a third rotation parameter and a third translation parameter between the first camera and the second camera based on the matched pairs of feature points in the transformed second target image and the second reference image, wherein the difference parameter includes the third rotation parameter and the third translation parameter.
According to a third aspect of the embodiments of the present disclosure, there is provided an anomaly locating device for camera calibration, including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
acquiring an image captured of a target object by a first camera of a terminal, to obtain a first image;
acquiring an image captured of the target object by a second camera of a reference terminal, to obtain a second image, wherein the second camera is the camera in the reference terminal corresponding to the first camera of the terminal;
calculating a difference parameter between the first camera and the second camera based on the matched feature pairs in the first image and the second image;
and determining influencing factors of the camera calibration anomaly according to the difference parameter.
According to a fourth aspect of the embodiments of the present disclosure, there is provided a computer-readable storage medium on which computer program instructions are stored; when executed by a processor, the instructions implement the steps of the anomaly locating method for camera calibration provided in the first aspect of the present disclosure.
The technical solutions provided by the embodiments of the present disclosure have at least the following beneficial effects:
The first camera of the terminal and the second camera of the reference terminal each capture an image of the target object, yielding a first image and a second image respectively. Feature extraction and feature matching can then be performed on the two images to obtain matched feature pairs. In this way, the difference parameter between the first camera and the second camera can be calculated from the feature pairs, and influencing factors of the camera calibration anomaly can be determined from the difference parameter. That is, the technical solution provides a data basis for determining the influencing factors of a camera calibration anomaly, which helps accurately locate the factors causing the anomaly and can in turn improve production efficiency.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a flowchart illustrating an anomaly locating method for camera calibration according to an exemplary embodiment.
FIG. 2 is a flow diagram illustrating the calculation of a difference parameter according to an exemplary embodiment.
Fig. 3 is a flowchart illustrating a camera calibration anomaly locating method according to an exemplary embodiment.
Fig. 4 is a flowchart illustrating a camera calibration anomaly locating method according to an exemplary embodiment.
Fig. 5 is a block diagram illustrating an anomaly locating device for camera calibration according to an exemplary embodiment.
FIG. 6 is a block diagram illustrating an apparatus in accordance with an example embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the disclosure, as detailed in the appended claims.
Fig. 1 is a flowchart illustrating a camera calibration anomaly locating method according to an exemplary embodiment, and as shown in fig. 1, the method may include the following steps.
In step S11, an image acquired by a first camera of the terminal for the target object is obtained, so as to obtain a first image.
The first camera may be a camera of the terminal that needs calibration. The target object may be a target chart; depending on the application scenario, the chart may take the form of a square-grid (checkerboard) pattern, a combination of a square-grid pattern and dots, and so on. For example, the first camera may be the terminal's main camera and the target object a square-grid chart, in which case step S11 amounts to capturing an image of the square-grid chart with the terminal's main camera to obtain the first image.
In step S12, an image captured of the target object by a second camera of a reference terminal is obtained as the second image. Here, the reference terminal may be a standard terminal of the same model as the terminal, selected through a screening process in a specific implementation. For example, a terminal may serve as the reference terminal when the variation of its dual-camera calibration results is smaller than a first difference threshold and the extrinsic parameters (such as the rotation and translation parameters) obtained from dual-camera calibration in a single test are smaller than a second difference threshold (for example, close to 0).
The second camera may be the camera in the reference terminal corresponding to the first camera of the terminal. Following the above example, when the first camera is the terminal's main camera, the second camera may be the reference terminal's main camera; likewise, when the first camera is the terminal's macro camera, the second camera is the reference terminal's macro camera.
In step S13, a difference parameter between the first camera and the second camera is calculated based on the matched feature pairs in the first image and the second image.
The feature pairs may be obtained from the first image and the second image. For example, in some implementation scenarios, feature extraction may be performed on the two images with a corner extraction algorithm, an edge detection algorithm, or the like. When the target object is a square-grid chart, the extracted features may be the corner points of the grid and their pixel coordinates; when the target object also includes dots, the extracted features may further include the center point of each dot and its pixel coordinates.
The extracted features can then be matched, for example with an optical flow method, to obtain matched feature pairs. When the extracted features are feature points, the resulting feature pairs may be feature point pairs, each including a first feature point located in the first image and a second feature point located in the second image. The first feature point and the second feature point may correspond to the same point on the target object.
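As an illustration of the matching step, the sketch below pairs extracted corner points from the two images by nearest pixel coordinates. This is a deliberately simplified stand-in for optical-flow matching; the function name and the distance threshold are assumptions, not part of the disclosure.

```python
import math

def match_feature_points(points_a, points_b, max_dist=20.0):
    """Greedily pair each corner from the first image with its nearest
    unused corner in the second image (simplified stand-in for optical
    flow).  Returns (first_feature_point, second_feature_point) pairs."""
    pairs = []
    unused = list(points_b)
    for pa in points_a:
        if not unused:
            break
        best = min(unused, key=lambda pb: math.dist(pa, pb))
        if math.dist(pa, best) <= max_dist:
            pairs.append((pa, best))
            unused.remove(best)
    return pairs
```

This greedy scheme works for near-aligned calibration charts; a real implementation would track features with pyramidal optical flow and reject outliers.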
In this way, a difference parameter between the first camera and the second camera may be calculated based on the matched feature pairs in the first image and the second image. For example, in one possible embodiment, the first camera includes a first target camera and a second target camera. Here, the first target camera and the second target camera may be a pair of cameras that require dual-camera calibration; in a specific implementation, they may be any two cameras of the terminal. The first image includes a first target image captured by the first target camera and a second target image captured by the second target camera.
The second camera comprises a first reference camera and a second reference camera. The first reference camera is a camera in the reference terminal corresponding to a first target camera of the terminal, and the second reference camera is a camera in the reference terminal corresponding to a second target camera of the terminal. The second image comprises a first reference image acquired by the first reference camera and a second reference image acquired by the second reference camera.
In this case, referring to a flowchart of calculating the difference parameter shown in fig. 2, calculating the difference parameter between the first camera and the second camera based on the matched feature pair in the first image and the second image (step S13) includes:
A first rotation parameter and a first translation parameter between the first target camera and the first reference camera are calculated based on the matched feature point pairs in the first target image and the first reference image, and the second target image is then transformed based on the first rotation parameter and the first translation parameter. The transformed second target image thus incorporates the pose difference between the first target camera and the first reference camera, so it can visually reflect the overall difference between the terminal's cameras and the reference terminal's cameras, which helps locate the influencing factors of the camera calibration anomaly.
In addition, a third rotation parameter and a third translation parameter between the first camera and the second camera may be calculated based on the matched pairs of feature points in the transformed second target image and the second reference image, and the difference parameter includes the third rotation parameter and the third translation parameter.
It should be understood that, because the transformed second target image incorporates the pose difference between the first target camera and the first reference camera, the third rotation parameter and third translation parameter calculated from the matched feature point pairs in the transformed second target image and the second reference image can quantitatively describe the overall difference between the terminal's cameras and the reference terminal's cameras.
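For intuition, the rotation-and-translation estimation between a target camera and its reference camera can be sketched in two dimensions as a least-squares rigid fit over matched point pairs, which is then applied to transform the second target image's points. This is an illustrative 2-D simplification (real calibration estimates 3-D rotations and uses camera intrinsics), and all names below are assumptions:

```python
import math

def estimate_rigid_2d(src, dst):
    """Least-squares 2-D rotation angle and translation mapping src points
    onto dst points (a 2-D Kabsch-style solve over matched point pairs)."""
    n = len(src)
    csx = sum(p[0] for p in src) / n
    csy = sum(p[1] for p in src) / n
    cdx = sum(p[0] for p in dst) / n
    cdy = sum(p[1] for p in dst) / n
    s_cos = s_sin = 0.0
    for (x, y), (u, v) in zip(src, dst):
        x, y, u, v = x - csx, y - csy, u - cdx, v - cdy
        s_cos += x * u + y * v   # aligns with cos(theta)
        s_sin += x * v - y * u   # aligns with sin(theta)
    theta = math.atan2(s_sin, s_cos)
    c, s = math.cos(theta), math.sin(theta)
    # translation takes the rotated source centroid onto the target centroid
    return theta, (cdx - (c * csx - s * csy), cdy - (s * csx + c * csy))

def apply_rigid_2d(theta, t, pts):
    """Transform points by the estimated rotation and translation, as is
    done to the second target image before the third parameters are found."""
    c, s = math.cos(theta), math.sin(theta)
    return [(c * x - s * y + t[0], s * x + c * y + t[1]) for x, y in pts]
```

On noise-free matched pairs this fit recovers the rotation and translation exactly; with real feature detections it gives the least-squares estimate.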
Of course, in some implementation scenarios, the calculation of the difference parameter may also involve the intrinsic parameters of the first camera and the second camera (such as focal length and principal point), which is not limited by this disclosure.
In this way, in step S14, influencing factors of the camera calibration anomaly are determined according to the difference parameter.
Following the above example, in some implementation scenarios, corresponding value intervals may be set for the third rotation parameter and the third translation parameter according to application requirements. When the rotation represented by the third rotation parameter falls outside the rotation interval, it is determined that the influencing factors of the camera calibration anomaly include rotation of the terminal's camera; when the translation represented by the third translation parameter falls outside the translation interval, it is determined that the influencing factors include translation of the terminal's camera.
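The interval check just described reduces to a simple classification; in the sketch below, the interval bounds and units are illustrative assumptions, not values given in the disclosure:

```python
def locate_anomaly_factors(third_rotation, third_translation,
                           rot_interval=(-0.5, 0.5),
                           trans_interval=(-1.0, 1.0)):
    """Return the influencing factors implied by the third rotation and
    third translation parameters falling outside their value intervals."""
    factors = []
    if not (rot_interval[0] <= third_rotation <= rot_interval[1]):
        factors.append("terminal camera is rotated")
    if not (trans_interval[0] <= third_translation <= trans_interval[1]):
        factors.append("terminal camera is translated")
    return factors
```

An empty result means neither parameter exceeded its interval, i.e., no rotation or translation anomaly is flagged.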
In this technical solution, the first camera of the terminal and the second camera of the reference terminal each capture an image of the target object, yielding a first image and a second image respectively. Feature extraction and feature matching are performed on the two images to obtain matched feature pairs. The difference parameter between the first camera and the second camera can then be calculated from the feature pairs, and influencing factors of the camera calibration anomaly can be determined from the difference parameter. That is, the technical solution provides a data basis for determining the influencing factors of a camera calibration anomaly, which helps accurately locate the factors causing the anomaly. In production, for example, these factors can be located quickly and accurately and addressed in time, improving production efficiency.
Fig. 3 is a flowchart of an anomaly locating method for camera calibration according to an exemplary embodiment of the present disclosure. Referring to fig. 3, the method includes:
s31, acquiring an image acquired by a first camera of the terminal to the target object to obtain a first image.
And S32, acquiring an image acquired by the second camera of the reference terminal to the target object to obtain a second image.
S33: calculate, based on a plurality of feature point pairs, a first distance value between the first feature points in the plurality of feature point pairs. Each feature point pair includes a first feature point located in the first image and a second feature point located in the second image.
S34: calculate a second distance value between the second feature points in the plurality of feature point pairs, the difference parameter including the first distance value and the second distance value.
Illustratively, take two feature point pairs: feature point pair 1 may include feature point A1 in the first image and feature point B1 in the second image, and feature point pair 2 may include feature point A2 in the first image and feature point B2 in the second image. The distance between A1 and A2 then gives the first distance value, and the distance between B1 and B2 gives the second distance value. The distance between feature points may be expressed as a Euclidean distance or the like. In addition, because the distances between feature points may be small, the squared distance may be used in place of the square-rooted Euclidean distance to improve the precision of the calculation.
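A minimal sketch of this distance computation, using the squared Euclidean distance as suggested above (the sum-over-all-pairs aggregation is an assumption; the disclosure does not fix how more than two feature points are combined):

```python
def squared_distance(p, q):
    """Squared Euclidean distance; skipping the square root, as suggested
    above, keeps precision when the distances involved are small."""
    return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2

def distance_value(points):
    """Aggregate spread of one image's feature points: the sum of the
    squared distances over every pair of points."""
    return sum(squared_distance(points[i], points[j])
               for i in range(len(points))
               for j in range(i + 1, len(points)))
```

Applied once to the first feature points and once to the second feature points, this yields the first and second distance values to compare.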
After the first distance value and the second distance value are calculated, the influencing factors of the camera calibration anomaly can be determined from them.
S35: when the first distance value is greater than the second distance value, determine that the influencing factors of the camera calibration anomaly include that the first camera focuses closer.
S36: when the first distance value is smaller than the second distance value, determine that the influencing factors of the camera calibration anomaly include that the first camera focuses farther.
It should be noted that when a camera photographs a target object, a change in focus distance shows up in the image as the object appearing smaller or larger. Therefore, by calculating the first distance value between first feature points in the first image and comparing it with the second distance value between second feature points in the second image, a focus difference between the terminal and the reference terminal can be determined. In addition, the first camera can be any camera of the terminal that needs calibration, such as the terminal's main camera or secondary camera.
In other words, the above technical solution can determine whether a focusing factor interferes with the camera calibration process by calculating a first distance value between the first feature points in the first image and comparing it with a second distance value between the second feature points in the second image. Furthermore, once a focus difference is determined, the relevant focus control chip and control strategy can be inspected, thereby resolving the camera calibration anomaly.
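The comparison in steps S34 to S36 can be sketched as follows (a minimal Python illustration; the function names, the use of squared distances, and the sample coordinates are assumptions of this sketch and do not appear in the disclosure):

```python
def squared_distance(p, q):
    # Squared Euclidean distance: dropping the square root avoids precision
    # loss when distances between feature points are small, as noted above.
    return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2

def focus_factor(first_points, second_points):
    """first_points: feature points (A1, A2) in the first image;
    second_points: matched feature points (B1, B2) in the second image."""
    d1 = squared_distance(first_points[0], first_points[1])    # first distance value
    d2 = squared_distance(second_points[0], second_points[1])  # second distance value
    if d1 > d2:
        return "first camera focuses closer"
    if d1 < d2:
        return "first camera focuses farther"
    return "no focus difference detected"

# Hypothetical coordinates: the target spans more pixels in the first image.
print(focus_factor([(0, 0), (10, 0)], [(0, 0), (8, 0)]))  # → first camera focuses closer
```

Since squaring is monotonic for non-negative values, comparing squared distances gives the same verdict as comparing the Euclidean distances themselves.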
Fig. 4 is a flowchart of an abnormal positioning method for camera calibration according to an exemplary embodiment of the present disclosure, and referring to fig. 4, the method includes:
S41, acquiring an image of the target object acquired by a first camera of the terminal to obtain a first image. The first camera comprises a first target camera and a second target camera, and the first image comprises a first target image acquired by the first target camera and a second target image acquired by the second target camera.
S42, acquiring an image of the target object acquired by a second camera of the reference terminal to obtain a second image. The second camera comprises a first reference camera and a second reference camera, and the second image comprises a first reference image acquired by the first reference camera and a second reference image acquired by the second reference camera.
S43, calculating a first rotation parameter and a first translation parameter between the first target camera and the first reference camera based on the matched feature point pairs in the first target image and the first reference image.
S44, calculating a second rotation parameter and a second translation parameter between the second target camera and the second reference camera based on the matched feature point pairs in the second target image and the second reference image. The difference parameter comprises the first rotation parameter, the first translation parameter, the second rotation parameter, and the second translation parameter.
For example, a pose estimation calculation may be performed on the coordinate values of the feature point pairs matched in the first target image and the first reference image to obtain the first rotation parameter and the first translation parameter, and on the coordinate values of the feature point pairs matched in the second target image and the second reference image to obtain the second rotation parameter and the second translation parameter. For the manner of calculating translation and rotation parameters from the coordinate values of feature point pairs, reference may be made to the related art; for brevity, it is not repeated here.
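The pose estimation itself is deferred to the related art above. As one hedged sketch, under the simplifying assumption that the pose difference can be modeled as a 2D rigid transform (the function name and sample points are illustrative, not from the disclosure), the rotation and translation parameters can be estimated from matched point coordinates by a least-squares fit:

```python
import math

def estimate_rotation_translation(src, dst):
    """Least-squares 2D rigid transform mapping src points onto dst points.
    Returns (angle_in_radians, (tx, ty)) such that dst ≈ R(angle) · src + t."""
    n = len(src)
    cx1 = sum(p[0] for p in src) / n; cy1 = sum(p[1] for p in src) / n
    cx2 = sum(p[0] for p in dst) / n; cy2 = sum(p[1] for p in dst) / n
    # Accumulate dot and cross products of the centered correspondences;
    # atan2(cross, dot) gives the least-squares rotation angle.
    dot = cross = 0.0
    for (x1, y1), (x2, y2) in zip(src, dst):
        u, v = x1 - cx1, y1 - cy1
        a, b = x2 - cx2, y2 - cy2
        dot += u * a + v * b
        cross += u * b - v * a
    angle = math.atan2(cross, dot)
    c, s = math.cos(angle), math.sin(angle)
    # The translation maps the rotated source centroid onto the target centroid.
    tx = cx2 - (c * cx1 - s * cy1)
    ty = cy2 - (s * cx1 + c * cy1)
    return angle, (tx, ty)

# Hypothetical matched feature point pairs: dst is src rotated 90° and shifted by (2, 3).
src = [(0, 0), (1, 0), (0, 1)]
dst = [(2, 3), (2, 4), (1, 3)]
angle, t = estimate_rotation_translation(src, dst)
print(round(math.degrees(angle), 1), [round(v, 6) for v in t])  # → 90.0 [2.0, 3.0]
```

A production implementation would instead estimate a full 3D pose, for example from an essential matrix; the 2D fit is only meant to show how rotation and translation parameters fall out of the matched coordinate pairs.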
S45, calculating a first difference value between the first rotation parameter and the second rotation parameter.
S46, when the first difference value is greater than the rotation difference threshold, determining that the influencing factors of the camera calibration anomaly include that rotation exists between the first target camera and the second target camera.
S47, calculating a second difference value between the first translation parameter and the second translation parameter.
S48, when the second difference value is greater than the translation difference threshold, determining that the influencing factors of the camera calibration anomaly include that translation exists between the first target camera and the second target camera.
According to the above technical solution, whether a camera rotation factor interferes with the camera calibration process can be determined by calculating the first difference value between the first rotation parameter and the second rotation parameter and comparing it with the rotation difference threshold. Further, when rotation exists between the first target camera and the second target camera, the terminal's cameras may have been abnormally assembled, and the relevant camera assembly step can be checked and verified.
In addition, the above technical solution can also calculate the second difference value and compare it with the translation difference threshold, thereby determining whether a camera translation factor interferes with the camera calibration process. Further, when translation is determined to exist between the first target camera and the second target camera, the terminal's cameras and/or the terminal's OIS (Optical Image Stabilization) component may have been abnormally assembled, and the relevant camera assembly step and the terminal's OIS can be checked and verified.
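Steps S45 to S48 reduce to two threshold comparisons, sketched below (the threshold values, units, and function names are illustrative assumptions; the disclosure does not specify them):

```python
ROTATION_DIFF_THRESHOLD = 0.5     # degrees; illustrative value, not from the disclosure
TRANSLATION_DIFF_THRESHOLD = 0.1  # millimetres; illustrative value

def calibration_anomaly_factors(rot1, rot2, trans1, trans2):
    """Compare the rotation/translation parameters of the two camera pairs
    and collect the influencing factors of the calibration anomaly."""
    factors = []
    if abs(rot1 - rot2) > ROTATION_DIFF_THRESHOLD:        # first difference value
        factors.append("rotation between first and second target cameras")
    if abs(trans1 - trans2) > TRANSLATION_DIFF_THRESHOLD:  # second difference value
        factors.append("translation between first and second target cameras")
    return factors

# Hypothetical parameters: rotation differs noticeably, translation does not.
print(calibration_anomaly_factors(1.2, 0.1, 0.05, 0.05))
```

An empty result indicates that neither assembly rotation nor assembly/OIS translation is flagged for this camera pair.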
Continuing with the example of Fig. 4, in one possible implementation, when it is determined that translation and/or rotation exists in the terminal's camera, the translation and rotation can also be quantified. In this case, the method further includes:
transforming the second target image based on the first rotation parameter and the first translation parameter. The transformed second target image thereby incorporates the pose difference between the first target camera and the first reference camera, so it can visually reflect the overall difference between the terminal's cameras and the reference terminal's cameras, which helps locate the influencing factors of the camera calibration anomaly.
In addition, a third rotation parameter and a third translation parameter between the first camera and the second camera may be calculated based on the matched feature point pairs in the transformed second target image and the second reference image, with the difference parameter including the third rotation parameter and the third translation parameter.
It should be understood that, because the transformed second target image incorporates the pose difference between the first target camera and the first reference camera, the third rotation parameter and third translation parameter calculated from the matched feature point pairs in the transformed second target image and the second reference image can quantitatively describe the overall difference between the terminal's cameras and the reference terminal's cameras.
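The transformation step can be sketched on 2D feature point coordinates (a real implementation would warp the entire second target image; that warping step, and the sample angle and shift below, are assumptions of this sketch):

```python
import math

def transform_points(points, angle, t):
    """Apply a rotation (radians) followed by a translation to 2D feature
    points, fusing the pose difference into the second target image's frame."""
    c, s = math.cos(angle), math.sin(angle)
    return [(c * x - s * y + t[0], s * x + c * y + t[1]) for x, y in points]

# Hypothetical feature points, a 180° rotation, and a shift of (1, 1).
pts = transform_points([(1, 0), (0, 2)], math.pi, (1, 1))
print([(round(x, 6), round(y, 6)) for x, y in pts])  # → [(0.0, 1.0), (1.0, -1.0)]
```

Matching features between points transformed this way and the second reference image then yields the third rotation and translation parameters described above.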
Based on the same inventive concept, the present disclosure also provides an abnormal positioning device for camera calibration. Fig. 5 is a block diagram of an abnormal positioning apparatus for camera calibration according to an exemplary embodiment of the present disclosure, where the apparatus 500 includes:
a first obtaining module 501, configured to obtain an image acquired by a first camera of a terminal for a target object, so as to obtain a first image;
a second obtaining module 502, configured to obtain an image acquired by a second camera of the reference terminal for the target object, so as to obtain a second image, where the second camera is a camera of the reference terminal corresponding to the first camera of the terminal;
a first calculation module 503 configured to calculate a difference parameter between the first camera and the second camera based on the matched feature pair in the first image and the second image;
a determining module 504 configured to determine an influencing factor of the camera calibration anomaly based on the difference parameter.
Optionally, the feature pairs are feature point pairs, each feature point pair includes a first feature point located in the first image and a second feature point located in the second image, and the first calculating module 503 includes:
a first calculation sub-module configured to calculate, based on a plurality of feature point pairs, a first distance value between the first feature points in the plurality of feature point pairs;
a second calculation sub-module configured to calculate a second distance value between the second feature points in the plurality of feature point pairs, the difference parameter including the first distance value and the second distance value.
Optionally, the determining module 504 includes:
a first determination sub-module configured to determine, when the first distance value is greater than the second distance value, that the influencing factors of the camera calibration anomaly include that the first camera focuses closer;
a second determination sub-module configured to determine, when the first distance value is smaller than the second distance value, that the influencing factors of the camera calibration anomaly include that the first camera focuses farther.
Optionally, the first camera includes a first target camera and a second target camera, and the first image includes a first target image acquired by the first target camera and a second target image acquired by the second target camera; the second camera comprises a first reference camera and a second reference camera, and the second image comprises a first reference image acquired by the first reference camera and a second reference image acquired by the second reference camera; the first calculating module 503 includes:
the third calculation sub-module is configured to calculate a first rotation parameter and a first translation parameter between the first target camera and the first reference camera based on the matched feature point pairs in the first target image and the first reference image;
a fourth calculation submodule configured to calculate a second rotation parameter and a second translation parameter between the second target camera and the second reference camera based on the matched pairs of feature points in the second target image and the second reference image;
wherein the difference parameter comprises the first rotation parameter, the first translation parameter, the second rotation parameter, and the second translation parameter.
Optionally, the determining module 504 includes:
a fifth calculation submodule configured to calculate a first difference value between the first rotation parameter and the second rotation parameter;
a third determination submodule configured to determine that an influencing factor of a camera calibration anomaly includes rotation between the first target camera and the second target camera when the first difference value is greater than a rotation difference threshold;
a sixth calculation submodule configured to calculate a second difference value between the first translation parameter and the second translation parameter;
a fourth determination submodule configured to determine that the influencing factor of the camera calibration anomaly includes that there is translation between the first target camera and the second target camera when the second difference value is greater than the translation difference threshold value.
Optionally, the apparatus 500 further comprises:
a first image transformation module configured to transform the second target image based on the first rotation parameter and the first translation parameter;
and the second calculation module is configured to calculate a third rotation parameter and a third translation parameter between the first camera and the second camera based on the matched feature point pairs in the transformed second target image and the second reference image.
Optionally, the first camera includes a first target camera and a second target camera, and the first image includes a first target image acquired by the first target camera and a second target image acquired by the second target camera; the second camera comprises a first reference camera and a second reference camera, and the second image comprises a first reference image acquired by the first reference camera and a second reference image acquired by the second reference camera; the first calculating module 503 includes:
a seventh calculation submodule configured to calculate a first rotation parameter and a first translation parameter between the first target camera and the first reference camera based on the matched pairs of feature points in the first target image and the first reference image;
an image transformation submodule configured to transform the second target image based on the first rotation parameter and the first translation parameter;
an eighth calculation submodule configured to calculate a third rotation parameter and a third translation parameter between the first camera and the second camera based on the matched pairs of feature points in the transformed second target image and the second reference image, wherein the difference parameter includes the third rotation parameter and the third translation parameter.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
The present disclosure further provides an abnormal positioning device for camera calibration, including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to execute the steps of the camera calibration anomaly locating method provided by the present disclosure.
The present disclosure also provides a computer readable storage medium having stored thereon computer program instructions, which when executed by a processor, implement the steps of the camera calibration anomaly locating method provided by the present disclosure.
Fig. 6 is a block diagram illustrating an apparatus 600 for camera calibration anomaly location according to an exemplary embodiment. For example, the apparatus 600 may be a mobile phone, a computer, a messaging device, a console, a tablet device, a personal digital assistant, and the like.
Referring to fig. 6, apparatus 600 may include one or more of the following components: a processing component 602, a memory 604, a power component 606, a multimedia component 608, an audio component 610, an interface for input/output (I/O) 612, a sensor component 614, and a communication component 616.
The processing component 602 generally controls overall operation of the device 600, such as operations associated with display, data communication, camera operations, and recording operations. The processing component 602 may include one or more processors 620 to execute instructions to perform all or a portion of the steps of the camera calibration anomaly locating method described above. Further, the processing component 602 can include one or more modules that facilitate interaction between the processing component 602 and other components. For example, the processing component 602 can include a multimedia module to facilitate interaction between the multimedia component 608 and the processing component 602.
The memory 604 is configured to store various types of data to support operations at the apparatus 600. Examples of such data include instructions, messages, pictures, videos, etc. for any application or method operating on device 600. The memory 604 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
Power component 606 provides power to the various components of device 600. Power components 606 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for device 600.
The multimedia component 608 includes a screen that provides an output interface between the device 600 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 608 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the device 600 is in an operating mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 610 is configured to output and/or input audio signals. For example, audio component 610 includes a Microphone (MIC) configured to receive external audio signals when apparatus 600 is in an operational mode, such as a recording mode and a speech recognition mode. The received audio signal may further be stored in the memory 604 or transmitted via the communication component 616. In some embodiments, audio component 610 further includes a speaker for outputting audio signals.
The I/O interface 612 provides an interface between the processing component 602 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor component 614 includes one or more sensors for providing various aspects of state assessment for the apparatus 600. For example, the sensor component 614 may detect the open/closed status of the device 600 and the relative positioning of components, such as the display and keypad of the device 600. The sensor component 614 may also detect a change in position of the device 600 or any component of the device 600, the presence or absence of user contact with the device 600, the orientation or acceleration/deceleration of the device 600, and a change in the temperature of the device 600. The sensor component 614 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor component 614 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 614 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 616 is configured to facilitate communications between the apparatus 600 and other devices in a wired or wireless manner. The apparatus 600 may access a wireless network based on a communication standard, such as WiFi, 4G, or 5G, or a combination thereof. In an exemplary embodiment, the communication component 616 receives broadcast signals or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 616 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra-Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 600 may be implemented by one or more Application-Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field-Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components for performing the above-described camera calibration anomaly locating method.
In an exemplary embodiment, a non-transitory computer-readable storage medium comprising instructions, such as the memory 604 comprising instructions, is also provided; the instructions are executable by the processor 620 of the apparatus 600 to perform the camera calibration anomaly locating method described above. For example, the non-transitory computer-readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
In another exemplary embodiment, a computer program product is also provided, which comprises a computer program executable by a programmable apparatus, the computer program having code portions for performing the above-described camera calibration anomaly locating method when executed by the programmable apparatus.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (10)

1. An abnormal positioning method for camera calibration is characterized by comprising the following steps:
acquiring an image of a target object acquired by a first camera of a terminal to obtain a first image;
acquiring an image of the target object acquired by a second camera of a reference terminal to obtain a second image, wherein the second camera is the camera in the reference terminal corresponding to the first camera of the terminal;
calculating a difference parameter between the first camera and the second camera based on the matched feature pairs in the first image and the second image;
and determining an influencing factor of the camera calibration anomaly according to the difference parameter.
2. The method of claim 1, wherein the pairs of features are pairs of feature points, each pair of feature points comprising a first feature point located in the first image and a second feature point located in the second image, and wherein calculating the difference parameter between the first camera and the second camera based on matching pairs of features in the first image and the second image comprises:
calculating a first distance value between the first feature points in the plurality of feature point pairs based on a plurality of feature point pairs; and
calculating a second distance value between the second feature points in the plurality of feature point pairs, the difference parameter including the first distance value and the second distance value.
3. The method according to claim 2, wherein the determining an influencing factor of the camera calibration anomaly according to the difference parameter comprises:
when the first distance value is greater than the second distance value, determining that the influencing factors of the camera calibration anomaly include that the first camera focuses closer; and
when the first distance value is smaller than the second distance value, determining that the influencing factors of the camera calibration anomaly include that the first camera focuses farther.
4. The method of claim 1, wherein the first camera comprises a first target camera and a second target camera, and the first image comprises a first target image captured by the first target camera and a second target image captured by the second target camera; the second camera comprises a first reference camera and a second reference camera, and the second image comprises a first reference image acquired by the first reference camera and a second reference image acquired by the second reference camera; the calculating a difference parameter between the first camera and the second camera based on the matched feature pair in the first image and the second image comprises:
calculating a first rotation parameter and a first translation parameter between the first target camera and the first reference camera based on the matched feature point pairs in the first target image and the first reference image;
calculating a second rotation parameter and a second translation parameter between the second target camera and the second reference camera based on the matched feature point pairs in the second target image and the second reference image;
wherein the difference parameter comprises the first rotation parameter, the first translation parameter, the second rotation parameter, and the second translation parameter.
5. The method according to claim 4, wherein said determining a factor affecting a camera calibration anomaly based on said difference parameter comprises:
calculating a first difference value between the first rotation parameter and the second rotation parameter;
when the first difference value is greater than a rotation difference threshold value, determining that the influence factors of abnormal camera calibration include rotation between the first target camera and the second target camera;
calculating a second difference value between the first translation parameter and the second translation parameter;
when the second difference value is greater than the translation difference threshold value, determining that the influence factor of abnormal camera calibration includes that translation exists between the first target camera and the second target camera.
6. The method of claim 5, further comprising:
transforming the second target image based on the first rotation parameter and the first translation parameter;
and calculating a third rotation parameter and a third translation parameter between the first camera and the second camera based on the matched feature point pairs in the transformed second target image and the second reference image.
7. The method of claim 1, wherein the first camera comprises a first target camera and a second target camera, and the first image comprises a first target image captured by the first target camera and a second target image captured by the second target camera; the second camera comprises a first reference camera and a second reference camera, and the second image comprises a first reference image acquired by the first reference camera and a second reference image acquired by the second reference camera; the calculating a difference parameter between the first camera and the second camera based on the matched feature pair in the first image and the second image comprises:
calculating a first rotation parameter and a first translation parameter between the first target camera and the first reference camera based on the matched feature point pairs in the first target image and the first reference image;
transforming the second target image based on the first rotation parameter and the first translation parameter;
and calculating a third rotation parameter and a third translation parameter between the first camera and the second camera based on the matched feature point pairs in the transformed second target image and the second reference image, wherein the difference parameter comprises the third rotation parameter and the third translation parameter.
8. An abnormal positioning device for camera calibration, characterized by comprising:
the first acquisition module is configured to acquire an image acquired by a first camera of the terminal on a target object to obtain a first image;
the second acquisition module is configured to acquire an image acquired by a second camera of the reference terminal on the target object to obtain a second image, wherein the second camera is a camera corresponding to the first camera of the terminal in the reference terminal;
a first calculation module configured to calculate a difference parameter between the first camera and the second camera based on the matched feature pairs in the first image and the second image;
a determining module configured to determine an influencing factor of the camera calibration anomaly according to the difference parameter.
9. An abnormal positioning device for camera calibration, characterized by comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
acquiring an image of a target object acquired by a first camera of a terminal to obtain a first image;
acquiring an image of the target object acquired by a second camera of a reference terminal to obtain a second image, wherein the second camera is the camera in the reference terminal corresponding to the first camera of the terminal;
calculating a difference parameter between the first camera and the second camera based on the matched feature pairs in the first image and the second image;
and determining an influencing factor of the camera calibration anomaly according to the difference parameter.
10. A computer-readable storage medium, on which computer program instructions are stored, which program instructions, when executed by a processor, carry out the steps of the method according to any one of claims 1 to 7.
CN202110838211.2A 2021-07-23 2021-07-23 Abnormal positioning method and device for camera calibration and storage medium Pending CN115690221A (en)


Publications (1)

Publication Number Publication Date
CN115690221A 2023-02-03



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination