CN107644442B - Spatial position calibration method of double-camera module - Google Patents


Info

Publication number: CN107644442B (application CN201610586510.0A; other version: CN107644442A)
Authority: CN (China)
Prior art keywords: image, module, points, spatial position, mark points
Legal status: Active (granted)
Inventors: 张胜, 廖海龙, 马江敏, 黄宇
Original and current assignee: Ningbo Sunny Opotech Co Ltd
Original language: Chinese (zh)

Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

The spatial position calibration method of the double-camera module comprises the steps of: shooting the same target with a first module and a second module to obtain a first image and a second image, respectively, wherein the target has a plurality of mark points arranged at intervals in an array; processing the first image and the second image to obtain clear black-and-white test images; adaptively locating the mark points in the first image and the second image, respectively; adaptively matching the mark points in the first image and the second image, respectively; calibrating the first module and the second module; and calculating the relative spatial position of the double-camera module. The invention reduces the influence of interference noise and ambient brightness on the result, has high stability, flexibility and precision, and is suitable for batch use in production.

Description

Spatial position calibration method of double-camera module
Technical Field
The invention relates to a spatial position calibration method for a double-camera module, in which the relative spatial position of the module is obtained by adaptive positioning and matching.
Background
With the development of technology, intelligent mobile devices such as mobile phones, tablet computers and PDAs have proliferated. It is now common for a person to own more than one smartphone, and the camera is one of a phone's main functions; camera quality is even among the conditions on which people choose a smartphone. Habits have changed with the times: most people today are accustomed to using the phone camera in place of a conventional camera to record their lives, and to sharing those records directly on the internet through the smartphone. It is worth mentioning that the mobile phone market still maintains rapid growth, smartphones especially. Since user demands on image quality grow ever higher, the pursuit of excellent camera-module performance is a key field of intense competition among the large phone manufacturers. In particular, in 2014 the camera module entered the dual-camera era; for the early dual cameras the main functions were 3D display and shooting capability, but the pixel counts were too low and the image quality poor, leaving great room for improvement in the design, production and manufacture of dual cameras.
However, mass production raises many problems. At present, the relative spatial position of the double-camera module must be obtained during production of the module lenses. Calibration of a single camera module is a mature technology, but traditional calibration methods usually assume that the correspondence of mark points between different images has already been determined, which is difficult for a double-camera module. Most conventional matching methods are algorithms for checkerboard patterns, and they have many disadvantages: they are sensitive to noise and image blur and require a very precisely controlled environment. Although self-identifying mark points can achieve automatic matching, that approach is harder to operate and more expensive, and is not conducive to batch use on a manufacturer's production line.
Disclosure of Invention
The invention mainly aims to provide a spatial calibration method for a double-camera module that obtains the internal and external parameters of a first module and a second module through adaptive positioning and matching, and thereby obtains the relative spatial position of the double-camera module.
Another objective of the present invention is to provide a spatial calibration method for a double-camera module that achieves fast positioning of the mark points by image recognition, reducing the positioning time.
Another objective of the present invention is to provide a spatial calibration method for a double-camera module that uses adaptive positioning and adaptive matching to reduce the influence of interference noise and ambient brightness on the result, with high stability and flexibility.
Another objective of the present invention is to provide a spatial calibration method for a double-camera module that accounts for multiple distortion factors through image enhancement, filtering, denoising and other processing, and therefore has higher precision.
Another objective of the present invention is to provide a spatial calibration method for a double-camera module that improves the stability and efficiency of image processing over conventional methods by targeted denoising.
Another objective of the present invention is to provide a spatial calibration method for a double-camera module that is simpler than conventional methods and suitable for mass production while ensuring accuracy.
To achieve the above object, the present invention provides a method for calibrating a spatial position of a dual camera module, comprising:
the method comprises the following steps that a first module and a second module respectively shoot a same target to obtain a first image and a second image, wherein the target is provided with a plurality of mark points which are arranged at intervals in an array manner;
processing the first image and the second image to obtain a clear black and white test image;
adaptively locating the landmark points in the first image and the second image, respectively;
adaptively matching the landmark points in the first image and the second image, respectively;
calibrating the first module and the second module; and
and calculating the relative spatial position of the double-camera module using the calibration results of the first module and the second module.
In some embodiments, the above method further comprises the step of: and rotating the target to enable the first module and the second module to respectively shoot the first image and the second image of the target at different angles.
In some embodiments, the step of processing the first image and the second image comprises the steps of: and carrying out gray processing and threshold segmentation on the first image and the second image.
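As a rough, non-authoritative sketch of this step (the patent does not name a specific algorithm), gray processing and threshold segmentation can be illustrated in Python with NumPy, using Otsu's method so that the threshold adapts to the image's own histogram:

```python
import numpy as np

def to_gray(rgb):
    """Convert an HxWx3 RGB image to grayscale (ITU-R BT.601 weights)."""
    return rgb @ np.array([0.299, 0.587, 0.114])

def otsu_threshold(gray):
    """Return the threshold that maximizes between-class variance."""
    hist, _ = np.histogram(gray.ravel(), bins=256, range=(0, 256))
    total = hist.sum()
    sum_all = np.dot(np.arange(256), hist)
    best_t, best_var = 0, -1.0
    w0 = cum = 0.0
    for t in range(256):
        w0 += hist[t]
        if w0 == 0 or w0 == total:
            continue
        cum += t * hist[t]
        m0 = cum / w0                       # mean of the dark class
        m1 = (sum_all - cum) / (total - w0)  # mean of the bright class
        var = w0 * (total - w0) * (m0 - m1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

def binarize(gray):
    """Threshold segmentation: produce the black-and-white test image."""
    return (gray > otsu_threshold(gray)).astype(np.uint8) * 255
```

Choosing the threshold from the histogram, rather than hard-coding it, is one way the segmentation can stay robust to changes in ambient brightness.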
In some embodiments, processing the first image and the second image further comprises performing image enhancement on the first image and the second image to enhance the brightness difference between the black and white regions.
In some embodiments, processing the first image and the second image denoises only local test regions of the first image and the second image.
In some embodiments, adaptively locating the mark points in the first image and the second image respectively further comprises: preliminarily processing the first image and the second image; identifying rough locating regions of the mark points in each image; precisely identifying the mark points within those rough locating regions; counting parameter values of the mark points; and establishing coordinates.
In some embodiments, preliminarily processing the first image and the second image further comprises scaling the first image and the second image.
In some embodiments, counting the parameter values of the mark points includes counting at least one of: the area of the mark points, the distance between the mark points, the shape of the mark points, and the contrast of the mark points.
In some embodiments, one of the marker points of the target has an adaptive anchor point for determining the image orientation, wherein the adaptive anchor point is directly identified and located.
In some embodiments, the landmark point having the adaptive anchor point is concentric with the adaptive anchor point and has a color difference.
In some embodiments, the landmark points with the adaptive anchor point appear black and white, respectively.
In some embodiments, adaptively matching the mark points in the first image and the second image further includes searching for adjacent mark points using the adaptive anchor point as an initial point, searching for the mark points that have not been matched again using the searched mark points as an initial point, repeating the previous step until all the mark points are matched, and numbering the mark points as required, so as to establish a one-to-one correspondence relationship between the mark points of the target and the mark points detected in the processed first image and the processed second image.
In some embodiments, the step of calibrating the first module and the second module comprises obtaining an internal parameter and an external parameter of the first module, and obtaining an internal parameter and an external parameter of the second module.
In some embodiments, the internal parameters and the external parameters of the first module are obtained as the rotation matrix R1 of the first module and the spatial position vector T1 of the first module, and the internal parameters and the external parameters of the second module are obtained as the rotation matrix R2 of the second module and the spatial position vector T2 of the second module.
In some embodiments, the relative Rotation matrix Rotation of the dual camera module and the relative spatial position Shift of the dual camera module are obtained by calculation using the calibration results of the first module and the second module.
In some embodiments, the relative spatial position of the dual camera module is obtained from the calibration results of the first module and the second module by the formulas Rotation = R1 × inv(R2) and Shift = T1 - Rotation × T2.
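The two formulas Rotation = R1 × inv(R2) and Shift = T1 - Rotation × T2 can be transcribed directly (a minimal sketch assuming NumPy; R1, R2 are 3×3 rotation matrices and T1, T2 are 3-vectors as defined above):

```python
import numpy as np

def relative_pose(R1, T1, R2, T2):
    """Relative pose of the dual camera module per the patent's formulas:
    Rotation = R1 * inv(R2), Shift = T1 - Rotation * T2."""
    Rotation = R1 @ np.linalg.inv(R2)
    Shift = T1 - Rotation @ T2
    return Rotation, Shift
```

Since R2 is a rotation matrix, inv(R2) equals its transpose, so `R2.T` could be substituted for numerical robustness.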
According to another aspect of the present invention, there is provided a target for spatial position calibration of a double-camera module, which has a plurality of mark points spaced apart from each other and arranged in an array, at least one of the mark points containing an adaptive positioning point that differs in color from the mark point.
In some embodiments, the marker point is black and the adaptive positioning point is white.
In some embodiments, the marker point is white and the adaptive positioning point is black.
In some embodiments, the mark point and the adaptive positioning point are circular or polygonal, such as triangular, square, or another polygon.
Drawings
Fig. 1 is a flowchart of the adaptive mark-point locating method of the double-camera module according to a preferred embodiment of the present invention.
Fig. 2 is a schematic diagram of the target of the spatial position calibration method of the double-camera module according to a preferred embodiment of the present invention.
Fig. 3 is a diagram illustrating the effect of adaptive matching of the mark points in the spatial position calibration method of the double-camera module according to a preferred embodiment of the present invention.
Fig. 4 is a schematic illustration of the target in an alternative mode in accordance with a preferred embodiment of the present invention.
Fig. 5 is a schematic view of the target according to another alternative mode of a preferred embodiment of the present invention.
Fig. 6 is a flowchart of the spatial position calibration method of the double-camera module according to a preferred embodiment of the present invention.
Detailed Description
The following description is presented to disclose the invention so as to enable any person skilled in the art to practice the invention. The preferred embodiments in the following description are given by way of example only, and other obvious variations will occur to those skilled in the art. The basic principles of the invention, as defined in the following description, may be applied to other embodiments, variations, modifications, equivalents, and other technical solutions without departing from the spirit and scope of the invention.
It is understood that the terms "a" and "an" indicate "at least one": an element recited in the singular may number one in one embodiment and several in another, and these terms should not be interpreted as limiting the quantity.
In the preferred embodiment, to calibrate the relative spatial position of the dual camera module 10, calibration is performed with adaptive positioning and matching using a target 20. The dual camera module 10 includes a first module 11 and a second module 12. The target 20 is rotatable and bears a plurality of mark points 30 arranged in an array, spaced apart from each other on the target 20 and preferably black to provide a substantially opaque effect under a light source. A target 20 of the preferred embodiment is shown in fig. 2. It should be noted that the shape of the mark points 30 is not limited to a circle; they may also be square, triangular or another polygonal shape.
In the preferred embodiment, the double camera module 10 is disposed on one side of the target 20 so that the first module 11 and the second module 12 can photograph the target 20 and its mark points 30. The first module 11 is started to shoot at least one first image 110, and the second module 12 is started to shoot at least one second image 120. By rotating the target 20, the first module 11 and the second module 12 can capture the first image 110 and the second image 120 at different angles, respectively.
The first image 110 and the second image 120 are first gray-scale processed, and image processing is performed after their gray-level images are obtained. By adaptive detection the mark points 30 in the images are located; the mark points 30 are then adaptively matched and labeled. Spatial coordinates are established from the matched mark points 30 and the spacing of the mark points 30 on the target 20, radial and tangential distortion are introduced, and the first module 11 and the second module 12 are calibrated, thereby obtaining the internal parameter 111 and external parameter 112 of the first module 11 and the internal parameter 121 and external parameter 122 of the second module 12. Calibration of the relative spatial position of the dual camera module 10 is completed using the calibration results of the first module 11 and the second module 12. It should be noted that in the description of the preferred embodiment, the first (second) image includes both the image captured by the first (second) module and images obtained by further processing it.
In order to ensure calibration accuracy, in the preferred embodiment the target 20 is rotated while the dual camera module 10 shoots it, so that three images are taken at different angles. That is, the first module 11 obtains three sets of the first images 110 and the second module 12 obtains three sets of the second images 120; under different implementation conditions the number of shots can be changed accordingly. After gray processing of the obtained first image 110 and second image 120, denoising processing such as image enhancement and filtering is performed to ensure calibration stability. It should be noted that the preferred embodiment employs a targeted denoising method: the better the denoising effect, the longer the required operation time and the lower the efficiency, so only the partial regions required for calculation are selected and processed. In a preferred embodiment the first image 110 and the second image 120 are 3264 × 2448 pixels; denoising the whole image takes about 2 s, while the targeted method, denoising only a local range, takes about 50 ms. This markedly reduces the operation time, improving the efficiency of the denoising operation and suiting the method to batch production.
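The targeted scheme can be sketched as follows (an illustration only: a 3×3 median filter stands in for the unspecified denoising step, and the region list is hypothetical):

```python
import numpy as np

def median3(img):
    """3x3 median filter; edges are handled by edge-value padding."""
    p = np.pad(img, 1, mode="edge")
    h, w = img.shape
    # Stack the nine 3x3-neighborhood shifts and take the per-pixel median.
    stack = [p[i:i + h, j:j + w] for i in range(3) for j in range(3)]
    return np.median(np.stack(stack), axis=0)

def denoise_rois(img, rois):
    """Denoise only the listed regions of interest (y0, y1, x0, x1);
    pixels outside the regions are left untouched."""
    out = img.astype(float).copy()
    for (y0, y1, x0, x1) in rois:
        out[y0:y1, x0:x1] = median3(img[y0:y1, x0:x1].astype(float))
    return out
```

Restricting the filter to the regions that actually contain mark points is the idea behind reducing the roughly 2 s whole-image pass to the roughly 50 ms local pass reported above.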
As shown in fig. 1, the adaptive calibration process for the first image 110 and the second image 120 of the preferred embodiment is as follows. The target 20 is photographed by the first module 11 and the second module 12 and imaged in the first image 110 and the second image 120, respectively. First, the first image 110 and the second image 120 are preliminarily processed, with scaling and fine adjustment. Then each region where a mark point 30 is located is roughly determined by image recognition, and the roughly determined regions are precisely located. The precisely located areas are counted to obtain the area value of each mark point 30 and the mutual distances between the mark points 30, so as to establish their spatial coordinate values. Finally, the mark points 30 are completely located.
Fig. 2 is a first image 110 captured by the first module 11 in the preferred embodiment. Wherein the marker points 30 in the first image 110 are adaptively calibrated. The first image 110 is initially processed and appropriately scaled down to find the main area of the target 20. By image recognition, each region where the marker point 30 is located is roughly determined for the first image 110, and there are 49 regions as shown in fig. 2. Then, the roughly determined region is precisely located. And counting the precisely positioned areas to obtain the area numerical value of the mark points 30 and the mutual distance value between the mark points 30 so as to establish the space coordinate value of the mark points. Finally, all 49 of the marker points 30 are located.
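A sketch of locating the regions and counting their statistics, under the assumption of simple 4-connected component labeling (the patent does not spell out its image-recognition step); the per-blob area and centroid are the parameter values used to establish the coordinates:

```python
import numpy as np
from collections import deque

def find_blobs(binary):
    """Locate mark-point blobs (nonzero pixels) in a binary image and return
    each blob's area and centroid via breadth-first flood fill."""
    h, w = binary.shape
    seen = np.zeros((h, w), bool)
    blobs = []
    for y in range(h):
        for x in range(w):
            if binary[y, x] and not seen[y, x]:
                q, pix = deque([(y, x)]), []
                seen[y, x] = True
                while q:
                    cy, cx = q.popleft()
                    pix.append((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w and binary[ny, nx] \
                                and not seen[ny, nx]:
                            seen[ny, nx] = True
                            q.append((ny, nx))
                ys, xs = zip(*pix)
                blobs.append({"area": len(pix),
                              "centroid": (sum(ys) / len(pix),
                                           sum(xs) / len(pix))})
    return blobs
```

For a 7 × 7 target this would yield 49 blobs; filtering by area, shape, or contrast (the statistics named above) would then reject spurious detections.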
The step of adaptively matching the mark points 30 in the first image 110 and the second image 120 is as follows. It should be noted that one of the plurality of mark points 30 of the target 20, shown in the lower left corner of fig. 2, carries an adaptive positioning point 31 to determine the image direction, and the adaptive positioning point 31 can be directly identified and located. It is directly identifiable because, for example, it contains a portion whose contrast differs from the other mark points 30, or because its area or shape differs significantly from the other mark points 30. In fig. 2 the adaptive positioning point 31 is arranged concentrically with its mark point 30 and differs in color: the mark point 30 is black and the adaptive positioning point 31 white, or conversely the mark point 30 white and the adaptive positioning point 31 black; more specifically, they may be arranged as concentric circles.
Firstly, the adaptive anchor point 31 is taken as a starting point, and the number is a. Starting from the adaptive positioning point 31, searching two marker points 30 with the shortest distance, defining one of the two marker points as a number b, and recording the direction relationship between the number b and the number a on the coordinate. And continuously searching the two mark points 30 with the shortest distance by using the mark point 30 with the number b, and taking the mark point 30 with the same direction relation as the number c. When all the marker points 30 are matched in this direction relationship, the search for the marker points 30 that have not been matched is started, and the search is started in the direction opposite to the direction relationship. And numbering the mark points 30 which are not matched until the matching is completed. When all the mark points 30 are matched, the mark points 30 are numbered in sequence to facilitate subsequent operations.
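The search just described can be reduced to the following greedy walk (a simplification: it keeps only the nearest-unmatched rule and omits the direction bookkeeping the patent records on the coordinates):

```python
import numpy as np

def order_from_anchor(points, anchor_idx):
    """Number detected marker points by a greedy walk: start at the adaptive
    anchor, then repeatedly move to the nearest not-yet-matched point.
    Returns a list where order[k] is the index of the point numbered k."""
    pts = np.asarray(points, float)
    order = [anchor_idx]
    remaining = set(range(len(pts))) - {anchor_idx}
    while remaining:
        cur = pts[order[-1]]
        nxt = min(remaining, key=lambda i: np.linalg.norm(pts[i] - cur))
        order.append(nxt)
        remaining.remove(nxt)
    return order
```

The returned ordering gives each detected point a sequence number, which is the basis of the one-to-one correspondence with the target's mark points.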
Fig. 3 shows the adaptive matching effect of the preferred embodiment on the first image 110. It is worth mentioning that an adaptive positioning point 31 is made in the mark points 30 of the target 20 to determine the image direction. The adaptive positioning point 31 has a part with a different contrast ratio from the other marker points 30, and the adaptive positioning point 31 can be directly identified and positioned, and the adaptive positioning point is located at the lower left corner of the target 20. Firstly, the adaptive anchor point 31 is taken as a starting point, and the number is a. Starting from the adaptive positioning point 31, two of the marker points 30 with the shortest distance are searched, and one of the marker points in the vertical direction is defined as a number b. And continuously searching the two mark points 30 with the shortest distance by using the mark point 30 with the number b, and taking the mark point 30 in the vertical direction as the number c. When all the mark points 30 are matched in the vertical direction, the search for the mark points 30 which have not been matched is started, and the search is started in the horizontal direction. And sequentially numbering the mark points 30 which are not matched until the matching is completed. When all the marker points 30 are matched, the marker points 30 are numbered sequentially. As shown in fig. 3, 49 of the mark points 30 on the target 20 are numbered 1 to 49 in sequence to facilitate subsequent operations.
Fig. 4 illustrates an alternative target 20, wherein the adaptive positioning point 31 is located at the upper left corner of the target 20. The adaptive positioning point 31 can be directly identified and located because its center has portions of differing contrast. First, the adaptive positioning point 31 is taken as the starting point and numbered a. Starting from it, the two mark points 30 with the shortest distance are searched, and the one in the horizontal direction is numbered b. The search continues from mark point b for the two nearest mark points 30, taking the one in the horizontal direction as number c. When all the mark points 30 in the horizontal direction are matched, the search for unmatched mark points 30 continues in the vertical direction, and so on, traversing the unmatched mark points 30 in an S shape until matching is complete. When all the mark points 30 are matched, they are numbered 1 to 49 in sequence.
FIG. 5 illustrates another alternative target 20, wherein the adaptive positioning point 31 is located at the center of the target 20. The adaptive positioning point 31 can be directly identified and positioned because the center has portions with different contrasts. Firstly, the adaptive anchor point 31 is taken as a starting point, and the number is a. Starting from the adaptive positioning point 31, four of the marking points 30 with the shortest distance are searched, and one of the marking points in the transverse direction is defined as a number b. And continuously searching the four mark points 30 with the shortest distance by using the mark points 30 with the number b, and taking the mark point 30 in the vertical direction as the number c. By analogy, in a spiral shape, the search for the marker points 30 that have not been matched is started until the matching is completed. When all the marker points 30 are matched, the marker points 30 are numbered 1 to 49 in sequence.
According to the adaptive positioning of the first image 110 and the second image 120 and the result of matching the mark points 30, the first module 11 and the second module 12 can be calibrated respectively. The spatial coordinates of the mark points 30 are established from the actual distance values of the mark points 30 of the target 20; combined with the image coordinates of the mark points 30 obtained by matching, the radial and tangential distortion of the first module 11 and the second module 12 are introduced, and the left and right cameras are calibrated respectively. The internal parameters 111 and external parameters 112 of the first module 11 and the internal parameters 121 and external parameters 122 of the second module 12 are obtained, and from them the relative spatial position of the dual camera module 10 is obtained by calculation.
In the preferred embodiment, the first module 11 and the second module 12 respectively capture three sets of the first image 110 and the second image 120 of the target 20 at different angles. The spatial coordinates of the mark points 30 are established from the actual distance values of the mark points 30 of the target 20 and, combined with the image coordinates of the mark points 30 obtained by matching, the left and right cameras are calibrated respectively. This yields the internal parameters 111 and external parameters 112 of the first module 11 and the internal parameters 121 and external parameters 122 of the second module 12, i.e. R1, T1, R2 and T2, where R1 and R2 are the rotation matrices of the first module 11 and the second module 12, and T1 and T2 are their spatial position vectors. Using the formulas Rotation = R1 × inv(R2) and Shift = T1 - Rotation × T2, Rotation denotes the relative rotation matrix of the dual camera module 10 and Shift its relative spatial position.
The flow of the method for calibrating the relative spatial position of the dual camera module 10 according to the preferred embodiment is shown in fig. 6. The first module 11 is started to shoot at least one first image 110, and the second module 12 is started to shoot at least one second image 120. By rotating the target 20, the first module 11 and the second module 12 can capture the first image 110 and the second image 120 at different angles, respectively. The first image 110 and the second image 120 are first gray-scale processed, and image processing is performed after their gray-level images are obtained. By adaptive detection the mark points 30 in the images are located; the mark points 30 are then adaptively matched and labeled. Spatial coordinates are established from the matched mark points 30 and the spacing of the mark points 30 on the target 20, radial and tangential distortion are introduced, and the first module 11 and the second module 12 are calibrated, thereby obtaining the internal parameter 111 and external parameter 112 of the first module 11 and the internal parameter 121 and external parameter 122 of the second module 12. Calibration of the relative spatial position of the dual camera module 10 is completed using the calibration results of the first module 11 and the second module 12.
In the preferred embodiment, when the dual camera module 10 shoots the target 20, the target 20 is rotated to shoot three images at different angles. That is, the first module 11 obtains three sets of the first images 110, and the second module 12 obtains three sets of the second images 120. And denoising the first image 110 and the second image 120, selecting a partial region required for calculation, and processing the partial region. In a preferred embodiment, the first image 110 and the second image 120 are selected to be 3264 × 2448 pixels, and the local denoising is performed in a local range by using the targeted denoising method of the preferred embodiment, with a computation time of 50 ms.
Next, the first module 11 captures the obtained first image 110. Wherein the marker points 30 in the first image 110 are adaptively calibrated. The first image 110 is initially processed and appropriately scaled down to find the main area of the target 20. Through image recognition, each region where the marker point 30 is located is roughly determined for the first image 110, and there are 49 regions in total. And precisely positioning the roughly determined area. And counting the precisely positioned areas to obtain the area numerical value of the mark points 30 and the mutual distance value between the mark points 30 so as to establish the space coordinate value of the mark points. Finally, all 49 of the marker points 30 are located. It is worth mentioning that an adaptive positioning point 31 is made in the mark points 30 of the target 20 to determine the image direction. And the adaptive positioning point 31 can be directly identified and positioned, and the adaptive positioning point is positioned at the lower left corner of the target 20. Firstly, the adaptive anchor point 31 is taken as a starting point, and the number is a. Starting from the adaptive positioning point 31, two of the marker points 30 with the shortest distance are searched, and one of the marker points in the vertical direction is defined as a number b. And continuously searching the two mark points 30 with the shortest distance by using the mark point 30 with the number b, and taking the mark point 30 in the vertical direction as the number c. When all the mark points 30 are matched in the vertical direction, the search for the mark points 30 which have not been matched is started, and the search is started in the horizontal direction. And sequentially numbering the mark points 30 which are not matched until the matching is completed. When all the marker points 30 are matched, the marker points 30 are numbered sequentially. 
The 49 mark points 30 on the target 20 are numbered 1 to 49 in sequence to facilitate subsequent operations.
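The chain-numbering step above can be sketched as a greedy walk over the detected centroids. This is a hedged illustration, not the patent's algorithm verbatim: the Euclidean distance metric and the "smaller horizontal offset means more vertical" tie-breaking rule are assumptions, since the text only states that the nearest unmatched mark points are searched in turn, preferring the vertical neighbour.

```python
import numpy as np

def number_mark_points(points, anchor_index):
    """Greedy numbering: start from the adaptive positioning point,
    repeatedly move to the nearer of the two closest unmatched mark
    points, preferring the one that is more nearly vertical.
    `points` is a sequence of (x, y) centroids; returns the visiting
    order, so index i in the result receives number i + 1."""
    points = np.asarray(points, dtype=float)
    order = [anchor_index]
    remaining = [i for i in range(len(points)) if i != anchor_index]
    current = anchor_index
    while remaining:
        # Sort unmatched points by distance from the current point.
        cand = sorted(remaining,
                      key=lambda i: np.hypot(*(points[i] - points[current])))
        # Among the two nearest candidates, prefer the one with the
        # smaller horizontal offset (i.e. the vertical neighbour).
        near = cand[:2]
        nxt = min(near, key=lambda i: abs(points[i][0] - points[current][0]))
        order.append(nxt)
        remaining.remove(nxt)
        current = nxt
    return order

# 2 x 2 toy grid with the anchor at the bottom-left corner: the walk
# first climbs the vertical column, then moves horizontally.
pts = [(0.0, 0.0), (0.0, 1.0), (1.0, 0.0), (1.0, 1.0)]
order = number_mark_points(pts, 0)
```

On the actual 7 × 7 target this produces a consistent 1-to-49 numbering in both cameras' images, which is what makes the left/right mark-point correspondence unambiguous.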
In the preferred embodiment, the image coordinates of the mark points 30 obtained by the matching are used to calibrate the left and right cameras respectively, obtaining the internal parameters 111 and the external parameters 112 of the first module 11 and the internal parameters 121 and the external parameters 122 of the second module 12, i.e. R1 and T1, and R2 and T2. R1 denotes the rotation matrix of the first module 11, R2 the rotation matrix of the second module 12, T1 the spatial position vector of the first module 11, and T2 the spatial position vector of the second module 12. The relative pose then follows from the formulas Rotation = R1 × inv(R2) and Shift = T1 - Rotation × T2, where Rotation denotes the relative rotation matrix of the double-camera module 10 and Shift denotes the relative spatial position of the double-camera module 10.
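The final combination step is a direct matrix computation. The sketch below implements exactly the two formulas stated in the text; the numeric extrinsics in the demonstration (identity rotations, a 30 mm baseline) are made-up values for illustration only.

```python
import numpy as np

def relative_pose(R1, T1, R2, T2):
    """Combine the per-camera external parameters into the relative
    pose of the double-camera module:
        Rotation = R1 * inv(R2)
        Shift    = T1 - Rotation * T2
    """
    rotation = R1 @ np.linalg.inv(R2)
    shift = T1 - rotation @ T2
    return rotation, shift

# Sanity check: if both cameras share the same rotation, the relative
# rotation is the identity and the shift reduces to T1 - T2.
R = np.eye(3)
T1 = np.array([0.0, 0.0, 0.0])
T2 = np.array([30.0, 0.0, 0.0])   # hypothetical 30 mm baseline
rot, shift = relative_pose(R, T1, R, T2)
```

Because R2 is a rotation matrix, `inv(R2)` equals its transpose, so in production code `R2.T` is the cheaper and numerically safer choice; `np.linalg.inv` is used here only to mirror the formula as written.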
It will be appreciated by persons skilled in the art that the embodiments of the invention described above and shown in the drawings are given by way of example only and are not limiting of the invention. The objects of the invention have been fully and effectively accomplished. The functional and structural principles of the present invention have been shown and described in the examples, and any variations or modifications of the embodiments of the present invention may be made without departing from the principles.

Claims (18)

1. A spatial position calibration method of a double-camera module, comprising the following steps:
shooting, by a first module and a second module respectively, a same target to obtain a first image and a second image, wherein the target is provided with a plurality of mark points arranged in an array at intervals;
processing the first image and the second image to obtain a clear black and white test image;
adaptively locating the mark points in the first image and the second image, respectively;
adaptively matching the mark points in the first image and the second image, respectively;
calibrating the first module and the second module; and
calculating the relative spatial position of the double-camera module using the calibration results of the first module and the second module;
wherein the step of adaptively locating the mark points in the first image and the second image, respectively, further comprises the steps of:
scaling the first image and the second image, respectively;
roughly locating, by image recognition, the regions of the mark points in the first image and the second image;
precisely identifying, by image recognition, the mark points of the first image and the second image within the roughly located regions; and
establishing the spatial coordinates of the mark points by obtaining the area value of each precisely located region and determining the mutual distance values between the mark points;
wherein one of the mark points of the target has an adaptive positioning point for determining the image orientation, wherein the adaptive positioning point is directly identified and located, and wherein the step of adaptively matching the mark points in the first image and the second image, respectively, further comprises: searching for the adjacent mark points with the adaptive positioning point as a starting point, searching for the unmatched mark points with the found mark points as starting points, repeating the previous step until all the mark points are matched, and numbering the mark points as required, thereby establishing a one-to-one correspondence between the mark points detected in the processed first image and second image and the mark points of the target.
2. The spatial position calibration method of a double-camera module according to claim 1, further comprising the step of: rotating the target so that the first module and the second module respectively shoot the first image and the second image of the target at different angles.
3. The spatial position calibration method of a double-camera module according to claim 1, wherein the step of processing the first image and the second image comprises: performing gray-scale processing and threshold segmentation on the first image and the second image.
4. The spatial position calibration method of a double-camera module according to claim 3, wherein the processing of the first image and the second image further comprises performing image enhancement on the first image and the second image to enhance the difference in brightness between the black and white regions.
5. The spatial position calibration method of a double-camera module according to claim 2, wherein, in processing the first image and the second image, denoising is performed only on a local test region of the first image and the second image.
6. The spatial position calibration method of a double-camera module according to claim 3, wherein, in processing the first image and the second image, denoising is performed only on a local test region of the first image and the second image.
7. The spatial position calibration method of a double-camera module according to claim 1, wherein counting the parameter values of the mark points comprises counting at least one of the area of the mark points, the distance between the mark points, the shape of the mark points, and the contrast of the mark points.
8. The spatial position calibration method of a double-camera module according to any one of claims 1 to 7, wherein the mark point having the adaptive positioning point is concentric with the adaptive positioning point and differs from it in color.
9. The spatial position calibration method of a double-camera module according to claim 8, wherein the mark point having the adaptive positioning point and the adaptive positioning point are respectively black and white.
10. The spatial position calibration method of a double-camera module according to any one of claims 1 to 7, wherein the step of calibrating the first module and the second module comprises: obtaining the internal parameters and the external parameters of the first module, and obtaining the internal parameters and the external parameters of the second module.
11. The spatial position calibration method of a double-camera module according to claim 10, wherein the internal parameters and the external parameters of the first module are obtained, the external parameters of the first module being the rotation matrix R1 of the first module and the spatial position vector T1 of the first module, and wherein the internal parameters and the external parameters of the second module are obtained, the external parameters of the second module being the rotation matrix R2 of the second module and the spatial position vector T2 of the second module.
12. The spatial position calibration method of a double-camera module according to claim 11, wherein the calibration results of the first module and the second module are used to calculate the relative rotation matrix Rotation of the double-camera module and the relative spatial position Shift of the double-camera module.
13. The spatial position calibration method of a double-camera module according to claim 12, wherein, using the calibration results of the first module and the second module, the relative spatial position of the double-camera module is obtained via the formulas

Rotation = R1 × inv(R2)

and

Shift = T1 - Rotation × T2.
14. A target for calibrating the spatial position of a double-camera module, having a plurality of mark points arranged in an array at intervals, at least one of the mark points having an adaptive positioning point differing in color from the mark points, wherein the target is used for calibrating the spatial position of the double-camera module according to a calibration method, wherein the calibration method comprises the following steps:
the first module and the second module respectively shoot the same target to obtain a first image and a second image;
processing the first image and the second image to obtain a clear black and white test image;
adaptively locating the mark points in the first image and the second image, respectively;
adaptively matching the mark points in the first image and the second image, respectively;
calibrating the first module and the second module; and
calculating the relative spatial position of the double-camera module using the calibration results of the first module and the second module;
wherein the step of adaptively locating the mark points in the first image and the second image, respectively, further comprises the steps of:
scaling the first image and the second image, respectively;
roughly locating, by image recognition, the regions of the mark points in the first image and the second image;
precisely identifying, by image recognition, the mark points of the first image and the second image within the roughly located regions; and
establishing the spatial coordinates of the mark points by obtaining the area value of each precisely located region and determining the mutual distance values between the mark points;
wherein the adaptive positioning point is used for determining the image orientation, wherein the adaptive positioning point is directly identified and located, and wherein the step of adaptively matching the mark points in the first image and the second image, respectively, further comprises: searching for the adjacent mark points with the adaptive positioning point as a starting point, searching for the unmatched mark points with the found mark points as starting points, repeating the previous step until all the mark points are matched, and numbering the mark points as required, thereby establishing a one-to-one correspondence between the mark points detected in the processed first image and second image and the mark points of the target.
15. The target of claim 14, wherein the mark points are black and the adaptive positioning points are white.
16. The target of claim 14, wherein the mark points are white and the adaptive positioning points are black.
17. The target of claim 14, wherein the mark point that contains the adaptive positioning point is concentric with the adaptive positioning point.
18. The target of claim 14, wherein the mark points and the adaptive positioning points are circular or polygonal.
CN201610586510.0A 2016-07-21 2016-07-21 Spatial position calibration method of double-camera module Active CN107644442B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610586510.0A CN107644442B (en) 2016-07-21 2016-07-21 Spatial position calibration method of double-camera module


Publications (2)

Publication Number Publication Date
CN107644442A CN107644442A (en) 2018-01-30
CN107644442B true CN107644442B (en) 2021-10-15

Family

ID=61109353





Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant