CN113115017A - 3D imaging module parameter inspection method and 3D imaging device - Google Patents

3D imaging module parameter inspection method and 3D imaging device

Info

Publication number
CN113115017A
CN113115017A (application CN202110246843.XA)
Authority
CN
China
Prior art keywords
reference plane
imaging module
parameters
lens
array
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110246843.XA
Other languages
Chinese (zh)
Other versions
CN113115017B (en)
Inventor
胡洪伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Opnous Smart Sensing & Ai Technology
Original Assignee
Opnous Smart Sensing & Ai Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Opnous Smart Sensing & Ai Technology
Priority to CN202110246843.XA
Publication of CN113115017A
Application granted
Publication of CN113115017B
Status: Active

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106 Processing image signals
    • H04N13/128 Adjusting depth or disparity
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106 Processing image signals
    • H04N13/122 Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or adding monoscopic depth cues
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N2013/0074 Stereoscopic image analysis
    • H04N2013/0081 Depth or disparity estimation from stereoscopic image signals

Abstract

The application discloses a parameter inspection method for a 3D imaging module, the 3D imaging module including a sensing array. The method includes the following steps: assembling the 3D imaging module to be inspected to a reference plane, wherein the surface of the reference plane is flat, and the lens internal parameters of the 3D imaging module and its position parameters relative to the reference plane are known; performing depth detection on the reference plane with the 3D imaging module, and acquiring the depth value of each block of the reference plane; mapping each row of array elements in the sensing array to a block in the reference plane, and determining the depth value of the block corresponding to each row of array elements; calculating the calculation parameters of the 3D imaging module according to the depth value of each block and the spatial geometric relationship between the block and its corresponding array elements; and judging whether the lens internal parameters and the position parameters of the 3D imaging module are accurate according to the calculation parameters.

Description

3D imaging module parameter inspection method and 3D imaging device
Technical Field
The application relates to the field of 3D imaging, in particular to a 3D imaging module parameter inspection method and a 3D imaging device.
Background
During 3D imaging, the 3D imaging module uses assembly information to perform the 3D imaging calculation. The assembly information includes, among other things, the relative position of the 3D imaging module with respect to the scene to be imaged after the module has been assembled into its application scene, and the accuracy of these parameters affects the final output 3D image.
In the prior art, a ToF (Time of Flight) module is often used for 3D imaging. A ToF module measures the distance, three-dimensional structure, or three-dimensional profile of an object by emitting a pulsed signal from a sensor and evaluating either the time interval from emission to reception or the phase shift accumulated by the light over its round trip to the object. A ToF sensor can obtain a grayscale image and a distance image simultaneously, and is widely applied in fields such as somatosensory control, behavior analysis, monitoring, automatic driving, artificial intelligence, machine vision, and automatic 3D modeling.
A 3D imaging module is typically used to capture a depth image or grayscale image of a region to be measured, so that the region can be analyzed and subsequent planning, such as path planning, can be performed. The accuracy of the image captured by the 3D imaging module is therefore very important.
It is desirable to provide a technique that can improve the accuracy of images captured by a 3D imaging module.
Disclosure of Invention
In view of this, the present application provides a 3D imaging module parameter checking method and a 3D imaging device, which can improve the accuracy of an image captured by the 3D imaging module.
The application provides a parameter inspection method for a 3D imaging module that includes a sensing array. The method includes the following steps:
assembling the 3D imaging module to be inspected to a reference plane, wherein the surface of the reference plane is flat, and the lens internal parameters of the 3D imaging module and its position parameters relative to the reference plane are known;
performing depth detection on the reference plane with the 3D imaging module, and acquiring the depth value of each block of the reference plane;
mapping each row of array elements in the sensing array to a block in the reference plane, and determining the depth value of the block corresponding to each row of array elements;
calculating the calculation parameters of the 3D imaging module according to the depth value of each block and the spatial geometric relationship between the block and its corresponding array elements;
and judging whether the lens internal parameters and the position parameters of the 3D imaging module are accurate according to the calculation parameters.
Optionally, the 3D imaging module includes an optical device, the optical device includes a lens, the center of the lens is at the same height relative to the reference plane as the center row of the sensing array, the lens internal parameters include the theoretical focal length of the lens, and the position parameters include the theoretical height of the center row of the sensing array relative to the reference plane.
Optionally, the calculated height, relative to the reference plane, of the center row of the sensing array is obtained for the block x according to the following equation:
Y = [(V - V0) × depth1] / fy
wherein Y is the calculated height of the center row of the sensing array relative to the reference plane, depth1 is the depth value of block x, V is the theoretical height, relative to the reference plane, of the row of array elements corresponding to block x, V0 is the theoretical height of the center row of the sensing array relative to the reference plane, and fy is the theoretical focal length of the lens.
Optionally, determining whether the lens internal parameters and the position parameters of the 3D imaging module are accurate according to the calculation parameters includes the following steps:
obtaining a difference function between the calculation parameters and the theoretical height of the center row of the sensing array relative to the reference plane;
and judging whether the lens internal parameters and the position parameters are accurate or not according to the difference function.
Optionally, the difference function is obtained according to the following equation:
delta1(V) = [(V - V0) × depth1] / (fy × H) - 1
wherein delta1(V) is the difference function, depth1 is the depth value of block x of the reference plane, V is the theoretical height, relative to the reference plane, of the row of array elements corresponding to block x, V0 is the theoretical height of the center row of the sensing array relative to the reference plane, fy is the theoretical focal length of the lens, and H is the theoretical height of the center of the lens relative to the reference plane.
Optionally, when determining whether the lens internal parameters are accurate according to the difference function, the method includes the following steps:
and acquiring the slope of the difference function, and judging that the theoretical focal length of the lens is accurate if the slope is within a first preset range, or else, judging that the theoretical focal length of the lens is inaccurate.
Optionally, when determining whether the position parameter is accurate according to the difference function, the method includes the following steps:
obtaining the difference value of each row of array elements through the difference function, and obtaining the average of the difference values over all rows of array elements;
and judging whether the theoretical height of the central row of the sensing array compared with the reference plane is accurate or not according to the average of the difference values.
Optionally, if the average value of the difference values is smaller than a second preset threshold, it is determined that the theoretical height of the central row of the sensor array compared with the reference plane is accurate, otherwise, it is not accurate.
Optionally, the plane in which the array elements of the sensing array are located is perpendicular to the reference plane, and the 3D imaging module parameter inspection is performed at least twice, wherein in at least one inspection each row of array elements in the sensing array is parallel to the reference plane, and in at least one inspection each column of array elements in the sensing array is parallel to the reference plane.
The present application further provides a 3D imaging apparatus, comprising:
the 3D imaging module is used for acquiring depth information of the area to be detected;
the controller is connected to the 3D imaging module and used for executing the 3D imaging module parameter inspection method according to the depth information;
and a memory storing control program code executable by the controller, the control program code being used for implementing the 3D imaging module parameter inspection method.
Optionally, the method further includes:
and the input and output device is connected to the memory and is used for a user to input the lens internal parameters of the 3D imaging module and the position parameters relative to the reference plane into the memory.
The 3D imaging module parameter inspection method and the 3D imaging device of the application calculate the calculation parameters of the 3D imaging module according to the correspondence between the array elements in the sensing array of the 3D imaging module and the blocks on the reference plane, the detected depth values of the blocks, and the lens internal parameters of the 3D imaging module and its position relationship relative to the reference plane, and then check whether the lens internal parameters and the position parameters are accurate, so that the 3D imaging module can capture more accurate depth images.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present application, and that those skilled in the art can obtain other drawings based on these drawings without creative effort.
Fig. 1 is a schematic flow chart illustrating steps of the method for checking parameters of a 3D imaging module according to an embodiment.
Fig. 2 is a schematic diagram illustrating the 3D imaging module performing parameter verification according to an embodiment.
Fig. 3 is a schematic flow chart illustrating steps of the method for checking parameters of a 3D imaging module according to an embodiment.
Fig. 4 is a schematic connection diagram of the 3D imaging device in an embodiment.
Detailed Description
The inventor has found through research that, when determining the block corresponding to each pixel of the depth image, the parameters of the 3D imaging module that need to be used include the height of the central row/column of the sensing array of the 3D imaging module relative to the plane on which the sensing array is mounted, and the focal length of the lens in the 3D imaging module. Therefore, the accuracy of the depth image can be improved by checking the accuracy of these parameters and determining a corresponding correction method.
To overcome the above problems, the inventor proposes a 3D imaging module parameter inspection method. Referring to fig. 1, the method comprises the following steps:
step S101: and assembling the 3D imaging module to be inspected to a reference plane, wherein the surface of the reference plane is flat, and the lens internal reference of the 3D imaging module and the position parameter relative to the reference plane are known.
The 3D imaging module comprises a ToF module. The ToF module comprises a ToF emitting unit, a sensing array, and an optical device including a lens. The ToF emitting unit is used for emitting detection light signals, the sensing array is used for receiving the light signals reflected from the outside, and the optical device is arranged in the light emitting direction of the ToF emitting unit to increase the detection efficiency of the light signals. The 3D imaging module can acquire a depth image of the reference plane according to the detection light signals and the reflected light signals, and determine the depth value of each block on the reference plane.
In the sensing array, sensor units serve as the array elements that acquire the depth information, and each sensor unit corresponds to one pixel in the depth image.
The optical device includes a lens whose center lies on the same axis as the center of the sensing array. In practice, the lens center and the center of the sensing array may also not be arranged on the same axis.
The surface of the reference plane is flat, without depressions or protrusions. The reference plane is used for parameter calibration of the 3D imaging module, so the influence of depressions and protrusions on the calibration result is eliminated.
When the 3D imaging module is subjected to parameter inspection, the inspected parameters include the theoretical focal length of the lens in the optical device and the theoretical height of the center row of the sensing array.
Please refer to fig. 2, which is a schematic diagram of the 3D imaging module assembled on the reference plane M. In the embodiment shown in fig. 2, the center row of the sensing array 301 is at the same height as the center point of the lens 201 of the 3D imaging module; therefore, the height V0 of the center row of the sensing array 301 is the same as the height ToF_Height of the center of the lens 201 above the ground.
In this step S101, the height of the 3D imaging module relative to the reference plane is known, and the heights of the center of the lens 201 and of the center row of the sensing array 301 relative to the reference plane are also known.
Step S102: and performing depth detection on the reference plane by using the 3D imaging module to obtain the depth value of each block of the reference plane.
Step S103: and respectively corresponding each row of array elements in the sensing array 301 to each block in the reference plane, and determining the depth value of the block corresponding to each row of array elements.
Based on the pinhole imaging principle and the principle of similar triangles, a first row of array elements at a height difference V1 from the center row of the sensing array 301 corresponds to block b in the reference plane, a second row of array elements at a height difference V2 from the center row corresponds to block c in the reference plane, and the depth values of block b and block c are obtained as depth1 and depth2, respectively.
Step S104: and calculating and obtaining the calculation parameters of the 3D imaging module according to the depth value of the block and the space geometric relationship between the block and the array element corresponding to the block.
Specifically, based on the pinhole imaging principle and the principle of similar triangles, depth1/fy = ToF_Height/V1, where V1 is the difference between the theoretical height V, relative to the reference plane, of the row of array elements corresponding to block b and the theoretical height V0 of the center row relative to the reference plane. In this case, ToF_Height = (V1 × depth1)/fy = [(V - V0) × depth1]/fy.
Here ToF_Height is the calculated height of the center row and V0 is the theoretical height of the center row.
The calculated height of the center row of the sensing array 301 is thus obtained according to the following equation:
Y = [(V - V0) × depth1] / fy
where Y is the calculated height of the center row of the sensing array 301 relative to the reference plane, depth1 is the depth value of block x, V is the theoretical height, relative to the reference plane, of the row of array elements corresponding to block x, V0 is the theoretical height of the center row of the sensing array relative to the reference plane, and fy is the theoretical focal length of the lens 201.
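As a numeric illustration of step S104, the following sketch evaluates the calculated height Y for a set of rows. All values (focal length, lens height, row offsets) are assumptions chosen for this example, and ideal pinhole depths stand in for the measured depth values:

```python
import numpy as np

# Hypothetical values, not taken from the patent: the sensor-side quantities
# (fy and the row offsets V - V0) are in millimeters, the plane-side
# quantities (H, depth1, Y) are in meters.
fy = 4.0     # theoretical focal length of the lens, mm
H = 0.50     # theoretical height of the lens center above the reference plane, m

# Heights of the rows of array elements relative to the center row (V - V0),
# for rows whose line of sight reaches the reference plane.
v_offsets = np.linspace(0.05, 2.0, 40)   # mm

# Ideal pinhole depths of the blocks imaged by those rows; in a real
# inspection these would be the depth values measured by the ToF module.
depth1 = H * fy / v_offsets              # m

# Step S104: calculated height of the center row, Y = (V - V0) * depth1 / fy
Y = v_offsets * depth1 / fy
print(Y[:5])   # with this ideal data every row reproduces the theoretical height H
```

With these ideal inputs, Y equals H for every row; the difference function introduced below quantifies how far the calculated height departs from the theoretical one.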
Step S105: and judging whether the lens internal parameters and the position parameters of the 3D imaging module are accurate or not according to the calculation parameters.
When determining whether the lens internal parameters and the position parameters of the 3D imaging module are accurate according to the calculation parameters, referring to fig. 3, the method includes the following steps:
step S1051: and acquiring a difference function of the calculation parameters and the theoretical height of the center row of the sensing array compared with the reference plane.
The difference function is obtained according to the following equation:
delta1(V) = [(V - V0) × depth1] / (fy × H) - 1
wherein delta1(V) is the difference function, depth1 is the depth value of block x of the reference plane, V is the theoretical height, relative to the reference plane, of the row of array elements corresponding to block x, V0 is the theoretical height of the center row of the sensing array relative to the reference plane, fy is the theoretical focal length of the lens, and H is the theoretical height of the center of the lens relative to the reference plane.
Step S1052: and judging whether the lens internal parameters and the position parameters are accurate or not according to the difference function.
The function delta1(V) can be simplified to delta1(V) = (Y/H) - 1, i.e., the difference between the ratio of the calculated height of the center row of the sensing array to its theoretical height and 1. Since the center of the lens is at the same height as the center row of the sensing array relative to the reference plane, this ratio is expressed in the formula as the ratio of the calculated height of the center row to the theoretical height H of the center of the lens.
If the calculated height of the center row is equal to the theoretical height, the difference function is 0; if the calculated height of the center row is greater than the theoretical height, the difference function is greater than 0; and if the calculated height of the center row is less than the theoretical height, the difference function is less than 0.
The slope of the difference function therefore provides a comparison between the calculated height of the center row and the theoretical height. Moreover, in the slope depth1/(fy × H) of delta1(V), the theoretical height H of the center of the lens can be measured directly and depth1 is also a measured value, so the slope is mainly determined by the focal length fy.
Therefore, when judging whether the theoretical parameter is accurate according to the difference function, the method comprises the following steps: and acquiring the slope of the difference function, and judging that the theoretical focal length of the lens is accurate if the slope is within a first preset range, or else, judging that the theoretical focal length of the lens is inaccurate.
Specifically, when the slope of the curve of the difference function delta1 is within the first preset range, the focal length fy is considered satisfactory. If the slope of the curve is greater than or equal to the first preset threshold, the theoretical focal length fy of the lens is considered to be too small and needs to be corrected accordingly.
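A minimal sketch of this slope check, assuming the difference values delta1(V) have already been computed for each row; the slope is estimated with a least-squares straight-line fit, and the first preset range is passed in as an argument because its value is not given in this description:

```python
import numpy as np

def check_focal_length(v_heights, delta1, slope_range):
    """Slope check for the theoretical focal length of the lens.

    v_heights   : theoretical heights V of the rows of array elements used in the check
    delta1      : difference values delta1(V) computed for those rows
    slope_range : (low, high) bounds of the first preset range (assumed, not from the patent)

    Returns (is_accurate, slope).
    """
    slope, _intercept = np.polyfit(v_heights, delta1, 1)  # least-squares line fit
    return slope_range[0] <= slope <= slope_range[1], float(slope)
```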
When judging whether the theoretical parameter is accurate according to the difference function, the method further comprises the following steps:
obtaining the difference value of each row of array elements through the difference function, and obtaining the average of the difference values over all rows of array elements;
and judging whether the theoretical height of the central row of the sensing array compared with the reference plane is accurate or not according to the average of the difference values.
If the average of the difference values is smaller than the second preset threshold, the theoretical height of the center row of the sensing array relative to the reference plane is judged to be accurate; otherwise it is judged to be inaccurate.
Specifically, the height of each row of array elements in the sensing array relative to the reference plane is substituted in turn into the difference function delta1(V) to obtain the difference value for that row; the difference values of all rows are summed and divided by the number of rows to obtain the average of the difference values. If this average is smaller than the second preset threshold, the theoretical height of the center row of the sensing array is considered to meet the standard; otherwise it is considered not to meet the standard.
When summing the difference values of the rows, the summation is performed according to the following equation:
sum = Σ (from V = V0 to V = Vmax) { [(V - V0) × depth1] / (fy × H) - 1 }
where sum is the sum of the difference values, V0 is the theoretical height of the center row of the sensing array relative to the reference plane, Vmax is the theoretical height of the highest row of the sensing array relative to the reference plane, the theoretical height relative to the reference plane of each row of array elements between V0 and Vmax is substituted as V in the summation, depth1 is the depth value of the block corresponding to each row, fy is the theoretical focal length of the lens, and H is the theoretical height of the center of the lens relative to the reference plane.
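The averaging check can be sketched as follows. The function name and the second preset threshold are assumptions of this illustration, and taking the absolute value of the average is one reading of the comparison described above:

```python
import numpy as np

def check_center_row_height(v_heights, depth1, V0, fy, H, second_threshold):
    """Mean-of-differences check for the theoretical center-row height.

    Evaluates delta1(V) = (V - V0) * depth1 / (fy * H) - 1 for every row
    between V0 and Vmax, averages the values over the rows, and compares the
    average with the second preset threshold.

    v_heights : theoretical heights V of the rows of array elements (V0..Vmax)
    depth1    : measured depth value of the block corresponding to each row
    Returns (is_accurate, mean_difference).
    """
    delta1 = (np.asarray(v_heights) - V0) * np.asarray(depth1) / (fy * H) - 1.0
    mean_difference = float(np.mean(delta1))
    return abs(mean_difference) < second_threshold, mean_difference
```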
In this embodiment, since the parameters to be checked manifest themselves when the sensing array is placed in different orientations, the 3D imaging module parameter inspection is performed at least twice during calibration: in at least one inspection each row of array elements in the sensing array is parallel to the reference plane, and in at least one inspection each column of array elements in the sensing array is parallel to the reference plane.
Therefore, in this embodiment, when performing the parameter verification, the following steps are further included:
and rotating the 3D imaging module to change the position of each array element of the sensing array relative to the reference plane, then performing depth detection, establishing a corresponding relation between each block on the reference plane and each array element in the sensing array after changing the direction, and calculating and obtaining the calculation parameters according to the depth value of each block and the space geometric relation between the block and each array element to judge whether the lens internal reference position parameters of the 3D imaging module meet the standard in other directions.
In the first 3D imaging module parameter inspection, each row of array elements in the sensing array is parallel to the reference plane. In the second inspection, the 3D imaging module is rotated by 90 degrees so that the columns in the sensing array are parallel to the reference plane, and the corresponding debugging operation is then carried out, so that the adjustment of the parameters in one direction of the sensing array is converted into an adjustment in the other direction.
At this time, the corresponding relationship between each array element in the sensing array and each block on the reference plane M needs to be determined again, and the slope, the average value, and the like of the second difference function delta2 are obtained.
In fact, the sensing array may also be rotated about an axis perpendicular to the sensing array and parallel to the reference plane, so that each row or column of array elements forms another included angle with the reference plane. In that case the included angle between the array elements and the reference plane needs to be taken into account when calculating the calculation parameters; based on the calculation disclosed in the present application, a person skilled in the art can derive the corresponding calculation without creative effort.
An embodiment of the present application also provides a 3D imaging apparatus.
Fig. 4 shows a 3D imaging device according to an embodiment. The 3D imaging device includes a 3D imaging module, a controller 402, and a memory 403.
The 3D imaging module is the ToF module 401 in fig. 4 and is used for acquiring depth information of the area to be detected. The ToF module comprises a ToF emitting unit, a sensing array, and an optical device; the ToF emitting unit emits detection light signals, the sensing array receives the light signals reflected from the outside, and the optical device is arranged in the light emitting direction of the ToF emitting unit to increase the detection efficiency of the light signals. The 3D imaging module can acquire a depth image of the reference plane according to the detection light signals and the reflected light signals, and determine the depth value of each block on the reference plane.
The controller 402 is connected to the 3D imaging module and is used for executing the 3D imaging module parameter inspection method of the above embodiment according to the depth information. The controller 402 includes at least one of a programmable logic device, a microcontroller, and a single-chip microcomputer.
The memory 403 stores control program code executable by the controller 402, and the control program code is used for implementing the 3D imaging module parameter inspection method.
The 3D imaging device further comprises an input/output device 404 connected to the memory 403, through which a user inputs the lens internal parameters of the 3D imaging module and the position parameters relative to the reference plane into the memory.
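As a sketch of the data that the input/output device 404 would place in the memory 403 for the controller 402, the known parameters and preset thresholds could be grouped in a simple container. The field names and types below are assumptions of this illustration, not part of the application:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class InspectionConfig:
    """User-supplied values held in memory for the parameter inspection."""
    fy: float                         # theoretical focal length of the lens
    V0: float                         # theoretical height of the center row of the sensing array
    H: float                          # theoretical height of the lens center above the reference plane
    slope_range: Tuple[float, float]  # first preset range for the slope check
    second_threshold: float           # second preset threshold for the mean-of-differences check
```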
The 3D imaging device of the application solves for the calculation parameters of the 3D imaging module according to the correspondence between the array elements in the sensing array of the 3D imaging module and the blocks on the reference plane, the detected depth values of the blocks, and the lens internal parameters and position parameters of the 3D imaging module, and judges the accuracy of the lens internal parameters and position parameters according to the calculation parameters. The 3D imaging module parameter inspection method and the 3D imaging device can therefore check the accuracy of the 3D imaging module parameters so that corresponding debugging can be carried out according to the inspection result, allowing the 3D imaging module to capture more accurate depth images.
The above-mentioned embodiments are only examples of the present application, and not intended to limit the scope of the present application, and all equivalent structures or equivalent flow transformations made by the contents of the specification and the drawings, such as the combination of technical features between the embodiments and the direct or indirect application to other related technical fields, are also included in the scope of the present application.

Claims (11)

1. A method for checking parameters of a 3D imaging module, wherein the 3D imaging module comprises a sensing array, the method comprises the following steps:
assembling a 3D imaging module to be inspected to a reference plane, wherein the surface of the reference plane is flat, and the lens internal parameters of the 3D imaging module and the position parameters relative to the reference plane are known;
using the 3D imaging module to perform depth detection on the reference plane, and acquiring the depth value of each block of the reference plane;
respectively corresponding each row of array elements in the sensing array to each block in the reference plane, and determining the depth value of the block corresponding to each row of array elements;
calculating and obtaining calculation parameters of the 3D imaging module according to the depth value of the block and the space geometric relationship between the block and the array element corresponding to the block;
and judging whether the lens internal parameters and the position parameters of the 3D imaging module are accurate or not according to the calculation parameters.
2. The method of claim 1, wherein the 3D imaging module comprises optics comprising a lens, the center of the lens is at the same height as a center row of the sensing array with respect to the reference plane, the lens intrinsic parameter comprises a theoretical focal length of the lens, and the position parameter comprises a theoretical height of the center row of the sensing array with respect to the reference plane.
3. The method of claim 2, wherein the calculated height of the center row in the sensor array relative to the reference plane is obtained according to the following equation:
Y = [(V - V0) × depth1] / fy
wherein Y is the calculated height of the center row of the sensing array relative to the reference plane, depth1 is the depth value of block x, V is the theoretical height, relative to the reference plane, of the row of array elements corresponding to block x, V0 is the theoretical height of the center row of the sensing array relative to the reference plane, and fy is the theoretical focal length of the lens.
4. The method for inspecting parameters of a 3D imaging module according to claim 2, wherein determining whether the lens internal parameters and the position parameters of the 3D imaging module are accurate according to the calculation parameters comprises the steps of:
obtaining a difference function of the calculation parameters and the theoretical height of the center row of the sensing array compared with the reference plane;
and judging whether the lens internal parameters and the position parameters are accurate or not according to the difference function.
5. The method of claim 4, wherein the difference function is obtained according to the following equation:
delta1(V) = [(V - V0) × depth1] / (fy × H) - 1
wherein delta1(V) is the difference function, depth1 is the depth value of block x of the reference plane, V is the theoretical height, relative to the reference plane, of the row of array elements corresponding to block x, V0 is the theoretical height of the center row of the sensing array relative to the reference plane, fy is the theoretical focal length of the lens, and H is the theoretical height of the center of the lens relative to the reference plane.
6. The method for inspecting parameters of a 3D imaging module according to claim 5, wherein determining whether the lens internal parameters are accurate according to the difference function comprises the steps of:
and acquiring the slope of the difference function, and judging that the theoretical focal length of the lens is accurate if the slope is within a first preset range, or else, judging that the theoretical focal length of the lens is inaccurate.
7. The method for inspecting parameters of a 3D imaging module according to claim 5, wherein determining whether the position parameters are accurate according to the difference function comprises the steps of:
obtaining the difference value of each row of array elements through the difference function, and obtaining the average of the difference values over all rows of array elements;
and judging whether the theoretical height of the central row of the sensing array compared with the reference plane is accurate or not according to the average of the difference values.
8. The method of claim 7, wherein if the average of the differences is less than a second predetermined threshold, the theoretical height of the center row of the sensor array compared to the reference plane is determined to be accurate, otherwise it is not.
9. The method for inspecting parameters of a 3D imaging module according to claim 1, wherein the plane in which the array elements of the sensing array are located is perpendicular to the reference plane, and at least two 3D imaging module parameter inspections are performed, wherein in at least one inspection each row of array elements of the sensing array is parallel to the reference plane, and in at least one inspection each column of array elements of the sensing array is parallel to the reference plane.
10. A 3D imaging apparatus, comprising:
the 3D imaging module is used for acquiring depth information of the area to be detected;
a controller connected to the 3D imaging module for performing the 3D imaging module parameter verification method of any one of claims 1 to 9 according to the depth information;
and a memory storing control program code executable by the controller, the control program code being used for implementing the 3D imaging module parameter inspection method.
11. The 3D imaging device according to claim 10, further comprising:
and the input and output device is connected to the memory and is used for a user to input the lens internal parameters of the 3D imaging module and the position parameters relative to the reference plane into the memory.
CN202110246843.XA 2021-03-05 2021-03-05 3D imaging module parameter inspection method and 3D imaging device Active CN113115017B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110246843.XA CN113115017B (en) 2021-03-05 2021-03-05 3D imaging module parameter inspection method and 3D imaging device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110246843.XA CN113115017B (en) 2021-03-05 2021-03-05 3D imaging module parameter inspection method and 3D imaging device

Publications (2)

Publication Number Publication Date
CN113115017A true CN113115017A (en) 2021-07-13
CN113115017B CN113115017B (en) 2022-03-18

Family

ID=76710953

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110246843.XA Active CN113115017B (en) 2021-03-05 2021-03-05 3D imaging module parameter inspection method and 3D imaging device

Country Status (1)

Country Link
CN (1) CN113115017B (en)

Patent Citations (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102334007A (en) * 2009-02-25 2012-01-25 莱卡地球系统公开股份有限公司 Leveling device and leveling method
CN105333818A (en) * 2014-07-16 2016-02-17 浙江宇视科技有限公司 3D space measurement method based on monocular camera
WO2016070680A1 (en) * 2014-11-06 2016-05-12 Beijing Zhigu Tech Co., Ltd. Methods and apparatus for controlling light field capture
US20160349496A1 (en) * 2015-05-27 2016-12-01 Hamamatsu Photonics K.K. Control apparatus and control method for spatial light modulator
CN105335969A (en) * 2015-10-16 2016-02-17 凌云光技术集团有限责任公司 Acquiring method of space correction parameters of colored line scan camera
US20170248796A1 (en) * 2016-02-29 2017-08-31 Tetravue, Inc. 3d imaging system and method
CN105791646A (en) * 2016-03-16 2016-07-20 中国人民解放军国防科学技术大学 Light field imaging device and parameter determination method thereof
CN107314761A (en) * 2017-05-24 2017-11-03 上海与德科技有限公司 Measuring method and electronic equipment
CN108629813A (en) * 2018-05-04 2018-10-09 歌尔科技有限公司 A kind of acquisition methods, the device of projection device elevation information
US20190339369A1 (en) * 2018-05-04 2019-11-07 Microsoft Technology Licensing, Llc Field Calibration of a Structured Light Range-Sensor
CN109325981A (en) * 2018-09-13 2019-02-12 北京信息科技大学 Based on the microlens array type optical field camera geometrical parameter calibration method for focusing picture point
CN109191374A (en) * 2018-10-10 2019-01-11 京东方科技集团股份有限公司 A kind of distortion parameter measurement method, apparatus and system
CN109596319A (en) * 2018-11-26 2019-04-09 歌尔股份有限公司 The detection system and method for optics module parameter
CN109767476A (en) * 2019-01-08 2019-05-17 像工场(深圳)科技有限公司 A kind of calibration of auto-focusing binocular camera and depth computing method
US20200225508A1 (en) * 2019-01-10 2020-07-16 6 Over 6 Vision Ltd. Apparatus, system, and method of determining one or more parameters of a lens
CN109714536A (en) * 2019-01-23 2019-05-03 Oppo广东移动通信有限公司 Method for correcting image, device, electronic equipment and computer readable storage medium
CN110009693A (en) * 2019-04-01 2019-07-12 清华大学深圳研究生院 A kind of Fast Blind scaling method of light-field camera
CN111080705A (en) * 2019-05-07 2020-04-28 像工场(深圳)科技有限公司 Calibration method and device for automatic focusing binocular camera
CN110246188A (en) * 2019-05-20 2019-09-17 歌尔股份有限公司 Internal reference scaling method, device and camera for TOF camera
CN110542540A (en) * 2019-07-18 2019-12-06 北京的卢深视科技有限公司 optical axis alignment correction method of structured light module
CN112254927A (en) * 2019-07-22 2021-01-22 南昌欧菲生物识别技术有限公司 Method and device for testing optical center of camera, computer equipment and storage medium
CN110738707A (en) * 2019-10-16 2020-01-31 北京华捷艾米科技有限公司 Distortion correction method, device, equipment and storage medium for cameras
CN111161358A (en) * 2019-12-31 2020-05-15 华中科技大学鄂州工业技术研究院 Camera calibration method and device for structured light depth measurement
CN111369632A (en) * 2020-03-06 2020-07-03 北京百度网讯科技有限公司 Method and device for acquiring internal parameters in camera calibration
CN111598931A (en) * 2020-04-13 2020-08-28 长安大学 Monocular vision system imaging parameter calibration device and method
CN112070845A (en) * 2020-08-31 2020-12-11 上海爱观视觉科技有限公司 Calibration method and device of binocular camera and terminal equipment
CN112198526A (en) * 2020-09-30 2021-01-08 上海炬佑智能科技有限公司 Reference plane adjustment and obstacle detection method, depth camera and navigation equipment
CN112198529A (en) * 2020-09-30 2021-01-08 上海炬佑智能科技有限公司 Reference plane adjustment and obstacle detection method, depth camera and navigation equipment

Also Published As

Publication number Publication date
CN113115017B (en) 2022-03-18

Similar Documents

Publication Publication Date Title
CN108332708B (en) Automatic detection system and detection method for laser level meter
US7019826B2 (en) Optical inspection system, apparatus and method for reconstructing three-dimensional images for printed circuit board and electronics manufacturing inspection
US5812268A (en) Grid array inspection system and method
CN106815867B (en) TOF camera calibration and correction system, and equipment and method thereof
JP3411829B2 (en) Method and apparatus for evaluating surface shape
US4947202A (en) Distance measuring apparatus of a camera
TWI420081B (en) Distance measuring system and distance measuring method
US7271919B2 (en) Confocal displacement sensor
US20080117438A1 (en) System and method for object inspection using relief determination
KR20070034100A (en) System and Method for Simultaneous 3D Height Measurement on Multiple Faces of Objects
US6304680B1 (en) High resolution, high accuracy process monitoring system
WO2008075632A1 (en) Test method for compound-eye distance measuring device, its test device and chart used for same
CN103234483B (en) A kind of detection method of parallelism of camera chip and device
CN102401901B (en) Distance measurement system and distance measurement method
CN113115017B (en) 3D imaging module parameter inspection method and 3D imaging device
CN103512657B (en) The pick-up unit of bore hole 3D LED screen display effect and detection method thereof
CN103297799A (en) Testing an optical characteristic of a camera component
US8102516B2 (en) Test method for compound-eye distance measuring apparatus, test apparatus, and chart used for the same
KR100855849B1 (en) Position variation measuring apparatus and figure inspecting apparatus using the same
CN113050073B (en) Reference plane calibration method, obstacle detection method and distance detection device
KR20160148735A (en) The apparatus for measuring camera principal point and the method thereof
KR19990023868A (en) Distance measuring method
JP2020046229A (en) Three-dimensional measuring device and three-dimensional measuring method
JP7442752B1 (en) Shape inspection method of inspected object
CN113192144A (en) ToF module parameter correction method, ToF device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant