CN111028294A - Multi-distance calibration method and system based on depth camera

Info

Publication number: CN111028294A
Application number: CN201910997191.6A
Authority: CN (China)
Prior art keywords: depth, value, fitting, pixel, pixel block
Legal status: Granted; Active
Priority date / Filing date: 2019-10-20
Publication date: 2020-04-17
Other languages: Chinese (zh)
Other versions: CN111028294B (granted 2024-01-16)
Inventors: 刘贤焯, 黄志明, 王琳
Current assignee: Shenzhen Orbbec Co Ltd
Original assignee: Shenzhen Orbbec Co Ltd
Related application: PCT/CN2020/090926 (WO2021077731A1)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85 Stereo camera calibration
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/10012 Stereo images

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

The invention discloses a multi-distance calibration method and system based on a depth camera, comprising the following steps: acquiring a plurality of depth images at a plurality of preset distances with a depth camera; calculating each measured depth value of a preset number of pixel blocks in each depth image; constructing a polynomial fitting function, and fitting the measured depth value and the actual depth value of each pixel block according to the polynomial fitting function; respectively calculating the values of the undetermined coefficients in the polynomial fitting function of each pixel block; and storing the values of the undetermined coefficients as the fitting parameters of each pixel block and generating a corresponding lookup table. The multi-distance calibration method improves the calibration accuracy of the depth camera and makes it easier to obtain accurate depth information in subsequent applications of the depth camera.

Description

Multi-distance calibration method and system based on depth camera
Technical Field
The invention belongs to the technical field of computer vision, and particularly relates to a multi-distance calibration method and system based on a depth camera.
Background
In image measurement and machine vision applications, in order to determine the relationship between the three-dimensional geometric position of a point on the surface of an object in space and the corresponding point in the image, a geometric model of camera imaging must be established; the parameters of this geometric model are the camera parameters. Under most conditions, these parameters must be obtained through experiment and calculation, and the process of solving them is called camera calibration. In image measurement and machine vision applications, calibration of the camera parameters is a critical step, and the accuracy of the calibration result and the stability of the calibration algorithm directly affect the accuracy of the results produced by the camera.
At present, depth cameras are attracting increasing attention across industries. A depth camera can acquire the depth information of a target, enabling 3D scanning, scene modeling and gesture interaction. When a depth camera is used to measure a depth image of a target, clear and accurate depth information is an important index of the camera's performance; however, due to the influence of temperature, illumination, scene and the like, the depth information acquired by a depth camera often contains large errors. On the other hand, errors inevitably occur during the manufacturing and assembly of depth camera components, which directly introduces systematic errors into the measurement. Therefore, calibration of the depth camera becomes all the more important, and improving the calibration accuracy of the depth camera plays a key role in its subsequent applications.
The above background disclosure is provided only to assist in understanding the inventive concept and technical solutions of the present invention; it does not necessarily belong to the prior art of the present patent application, and, absent clear evidence that the above content was disclosed before the filing date of the present patent application, it should not be used to evaluate the novelty and inventive step of the present application.
Disclosure of Invention
The present invention is directed to a multi-distance calibration method and system based on a depth camera, so as to solve at least one of the above-mentioned problems.
In order to achieve the above purpose, the technical solution of the embodiment of the present invention is realized as follows:
a multi-distance calibration method based on a depth camera comprises the following steps:
acquiring a plurality of depth images at a plurality of preset distances by a depth camera;
calculating each measured depth value of a preset number of pixel blocks in each depth image;
constructing a polynomial fitting function and fitting the measured depth value and the actual depth value of each pixel block according to the polynomial fitting function;
respectively calculating the value of the undetermined coefficient in the polynomial fitting function of each pixel block;
and storing the values of the undetermined coefficients as the fitting parameters of each pixel block and generating a corresponding lookup table.
In some embodiments, further comprising: searching the lookup table according to the pixel blocks to obtain corresponding fitting parameters so as to generate corresponding polynomial fitting functions; and calculating the actual depth value according to the input measured depth value of the pixel block and the corresponding polynomial fitting function.
In some embodiments, further comprising: and calculating the average value of the actual depth values of the preset number of pixel blocks and taking the average value as the actual depth value of the depth image.
In some embodiments, the block of pixels is a single pixel.
In some embodiments, the polynomial fit function is:
Zg = a0*Zt^2 + a1*Zt + a2

where Zg is the actual depth value, Zt is the measured depth value, the fitting parameters of the pixel block are [a0, a1, a2], and the fitting parameters of the pixel blocks in the preset number of pixel blocks differ from one another.
Another technical solution of the present invention is as follows:
a multi-distance calibration system based on a depth camera, comprising: an acquisition unit for acquiring a plurality of depth images at a plurality of preset distances; a first calculation unit for calculating each measured depth value of a preset number of pixel blocks in each depth image; a fitting unit for constructing a polynomial fitting function and fitting each measured depth value to the actual depth value according to the polynomial fitting function; a second calculation unit for respectively calculating the values of the undetermined coefficients in the polynomial fitting function of each pixel block; and a storage unit for storing the values of the undetermined coefficients as the fitting parameters of each pixel block and generating a corresponding lookup table.
In some embodiments, further comprising: the searching unit is used for searching the lookup table according to the pixel blocks to obtain corresponding fitting parameters so as to generate corresponding polynomial fitting functions;
and a third calculation unit for calculating an actual depth value according to the measured depth value of the input pixel block and the corresponding polynomial fitting function.
In some embodiments, further comprising: and the fourth calculation unit is used for calculating the average value of the actual depth values of the preset number of pixel blocks and taking the average value as the actual depth value of the depth image.
In some embodiments, the block of pixels is a single pixel.
In some embodiments, the polynomial fit function is:
Zg = a0*Zt^2 + a1*Zt + a2

where Zg is the actual depth value, Zt is the measured depth value, the fitting parameters of the pixel block are [a0, a1, a2], and the fitting parameters of the pixel blocks in the preset number of pixel blocks differ from one another.
The technical scheme of the invention has the beneficial effects that:
compared with the prior art, the multi-distance calibration method improves the calibration precision of the depth camera, and is convenient for obtaining accurate depth information in the subsequent application of the depth camera.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
FIG. 1 is a flowchart of a depth camera-based multi-distance calibration method according to one embodiment of the invention.
FIG. 2 is a block diagram of a depth camera-based multi-distance calibration system according to another embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood by those skilled in the art, the technical solutions in the embodiments of the present invention will be described below clearly and completely with reference to the drawings of the embodiments. It is apparent that the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention. It should be noted that the embodiments of the present application and the features in the embodiments may be combined with each other where there is no conflict.
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present invention with unnecessary detail.
It will be understood that when an element is referred to as being "secured to" or "disposed on" another element, it can be directly on the other element or be indirectly on the other element. When an element is referred to as being "connected to" another element, it can be wired or wirelessly connected to the other element for data transfer purposes. In the description of the embodiments of the present invention, "a plurality" means two or more unless specifically limited otherwise.
Furthermore, references to "first", "second", etc. in the description, the claims, and the drawings are only used to distinguish between similar objects and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated; that is, these terms do not necessarily describe a particular order or sequence. Furthermore, it should be understood that such terms may be interchanged where appropriate in order to describe embodiments of the invention.
As an embodiment of the present invention, a multi-distance calibration method based on a depth camera is provided. Referring to FIG. 1, FIG. 1 is a flowchart illustrating the implementation of a multi-distance calibration method based on a depth camera according to an embodiment of the present invention; the method includes steps S110 to S150 and is suitable for multi-distance calibration of a depth camera. It will be appreciated that the method may be performed by a multi-distance calibration system and may be implemented in software and/or hardware. The specific implementation of each step is as follows:
and S110, acquiring a plurality of depth images at a plurality of preset distances through a depth camera.
Depth images of the same target object are acquired with the depth camera at different distances. For example, over a range of 1 m to 2 m from the depth camera, one depth image can be acquired every 10 cm, i.e. at 110 cm, 120 cm, 130 cm, 140 cm, 150 cm, 160 cm, 170 cm, 180 cm, 190 cm and 200 cm from the same target object, so that the camera is calibrated at different distances. It should be noted that, in practical applications, measurements at many different distances are often required in order to improve the calibration accuracy; the range of 1 m to 2 m with a 10 cm interval is chosen here purely for illustration and should not be construed as limiting the present invention.
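As a rough illustration of this acquisition step (and not part of the claimed method), the Python sketch below loops over the example distances above; `capture_depth_image` is a hypothetical stand-in for a depth camera SDK call and here merely synthesises a biased, noisy frame so the sketch can run.

```python
import numpy as np

# Example preset distances from the text: 110 cm to 200 cm in 10 cm steps, in mm.
preset_distances_mm = list(range(1100, 2001, 100))

def capture_depth_image(distance_mm, shape=(480, 640)):
    """Hypothetical stand-in for a depth camera SDK call.

    A real implementation would grab a frame with the target placed at
    distance_mm; here a frame with a small systematic bias and noise is
    synthesised purely so the sketch is runnable.
    """
    rng = np.random.default_rng(int(distance_mm))
    return distance_mm * 1.02 + rng.normal(0.0, 2.0, shape)

# One depth image per preset distance, keyed by the manually measured distance.
depth_images = {d: capture_depth_image(d) for d in preset_distances_mm}
```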
The depth camera may be based on structured light, binocular stereo, time of flight (TOF) or other technical schemes. In the present invention, a structured light depth camera is taken as an example; generally, a structured light depth camera includes a transmitting module and a receiving module. In one embodiment, to avoid interference from ambient light, image capture is preferably performed indoors: the structured light pattern emitted by the transmitting module is an infrared speckle pattern, the receiving module is an infrared camera, the structured light pattern is captured by the infrared camera and output to the processor, and the processor obtains the depth image of the target object by computing on the structured light pattern.
S120, calculating each measured depth value of a preset number of pixel blocks in each depth image.
It should be noted that, unlike a color image in which the pixel values represent colors, a depth image reflects the depth information of the target object, and its pixel values represent the measured distance from the target object to the depth camera. Therefore, the measured depth value of each pixel block can be obtained by scanning a preset number of pixel blocks in a predetermined direction. It can be understood that, in order to keep the depth values of the pixel blocks in the depth image changing continuously, the pixel blocks in the transition regions between pixel blocks can be obtained by interpolating the depth values of the surrounding blocks. In an embodiment, the pixel block is a single pixel in size.
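The sketch below illustrates one way to implement this step, assuming square, non-overlapping pixel blocks of side `block` whose measured depth value is the mean of the pixels inside the block (block = 1 reduces to the single-pixel case); the function name and the block layout are illustrative assumptions rather than requirements of the application.

```python
import numpy as np

def block_depth_values(depth_image, block=8):
    """Return one measured depth value per pixel block (the block's mean depth)."""
    h, w = depth_image.shape
    hb, wb = h // block, w // block
    # Crop so the image splits evenly into blocks, then average inside each block.
    cropped = depth_image[:hb * block, :wb * block]
    return cropped.reshape(hb, block, wb, block).mean(axis=(1, 3))  # shape (hb, wb)
```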
S130, constructing a polynomial fitting function and fitting a plurality of pairs of measured depth values and actual depth values of each pixel block according to the polynomial fitting function.
After the measured depth value of each pixel block is obtained, a polynomial fitting function Zg = a0*Zt^2 + a1*Zt + a2 is constructed to fit the measured depth values to the actual depth values, where Zt is the measured depth value and Zg is the actual depth value; the actual depth value, i.e. the physical distance from the target object to the depth camera, can be obtained by manual measurement. It should be noted that, because of lens distortion, the depth value measured in real time deviates from the true depth value, and the fitting is performed to correct this deviation.
S140, respectively calculating the values of the undetermined coefficients in the polynomial fitting function of each pixel block.
In one embodiment, for each pixel block, the corresponding fitting parameters [a0, a1, a2] can be obtained by inputting a plurality of pairs of measured depth values and true depth values at different distances and solving the resulting system of equations, where the number of measurements at different depths should be larger than the number of fitting parameters. It will be appreciated that the more measurements there are, the more accurate the calculated fitting parameters. Since the distortion differs at each position in the image, the fitting parameters [a0, a1, a2] of each pixel block are different. Thus, when the depth value is computed in real time, the depth correction can be performed per pixel block, which greatly improves the accuracy of the actual depth value.
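A minimal least-squares sketch of this fitting step is given below. It assumes the per-block measured depths from all calibration distances have been stacked into one array (for example by applying `block_depth_values` from the previous sketch to each depth image) and uses `np.polyfit` with degree 2 as one possible solver for the over-determined system; the solver choice is an implementation detail, not something prescribed by the application.

```python
import numpy as np

def fit_block_parameters(measured, actual_mm):
    """Fit Zg = a0*Zt^2 + a1*Zt + a2 independently for every pixel block.

    measured  : array of shape (n_distances, h_blocks, w_blocks) holding the
                measured block depths at each calibration distance.
    actual_mm : the n_distances manually measured ground-truth distances;
                n_distances must exceed the 3 unknown coefficients.
    """
    n_distances, hb, wb = measured.shape
    actual = np.asarray(actual_mm, dtype=float)
    params = np.empty((hb, wb, 3))            # one [a0, a1, a2] per pixel block
    for i in range(hb):
        for j in range(wb):
            # np.polyfit returns coefficients highest order first: a0, a1, a2.
            params[i, j] = np.polyfit(measured[:, i, j], actual, deg=2)
    return params
```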
S150, storing the values of the undetermined coefficients as the fitting parameters of each pixel block and generating a corresponding lookup table.
After the fitting parameters of each pixel block are found, they are stored in memory to form a lookup table. In one embodiment, each pixel block has its own set of fitting parameters. Therefore, when the depth value of the target object is measured in real time, the corresponding fitting parameters can be retrieved from the lookup table for each pixel block in the acquired depth image to generate the corresponding polynomial fitting function, and the measured depth value is then corrected according to that function. In this way, the depth values at multiple points in the depth image can be corrected quickly and simultaneously, and the average of the corrected depth values can be used as the depth value of the target object.
In an embodiment, the actual depth values are calculated from the measured depth values of the input pixel blocks and the corresponding polynomial fitting function, and an average value of the actual depth values of the preset number of pixel blocks is calculated and used as the actual depth value of the depth image.
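Continuing the same assumed sketches, the run-time correction can look like the following: each block's fitting parameters are taken from the `params` lookup table produced above, the polynomial is evaluated on the measured block depths, and the corrected values are averaged into one actual depth value for the depth image.

```python
import numpy as np

def correct_depth(depth_image, params, block=8):
    """Correct a measured depth image with the per-block fitting parameters.

    Returns the corrected per-block depth values and their average, which can
    serve as the actual depth value of the depth image.
    """
    zt = block_depth_values(depth_image, block)      # measured value per block
    a0, a1, a2 = params[..., 0], params[..., 1], params[..., 2]
    zg = a0 * zt**2 + a1 * zt + a2                   # Zg = a0*Zt^2 + a1*Zt + a2
    return zg, float(zg.mean())
```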
It should be noted that the number of sets of fitting parameters determines how many pixel points can be corrected. In general, the more sets of fitting parameters there are, the more pixel points can be corrected and the higher the accuracy of the final depth value of the target object, but the correction is correspondingly slower; the number of sets of fitting parameters can therefore be chosen to balance the requirements on accuracy and speed.
Another embodiment of the present invention is a multi-distance calibration system based on a depth camera, please refer to fig. 2. Fig. 2 is a schematic diagram of a multi-distance calibration system based on a depth camera, and the system 20 includes an acquisition unit 201, a first calculation unit 202, a fitting unit 203, a second calculation unit 204, and a storage unit 205.
Wherein the acquisition unit 201 is configured to acquire a plurality of depth images at a plurality of preset distances;
the first calculating unit 202 is configured to calculate each measured depth value of a preset number of pixel blocks in each depth image;
the fitting unit 203 is configured to construct a polynomial fitting function and fit each measured depth value with an actual depth value according to the polynomial fitting function;
the second calculating unit 204 is configured to respectively calculate the values of the undetermined coefficients in the polynomial fitting function of each pixel block;
the storage unit 205 is configured to store the values of the undetermined coefficients as the fitting parameters of each pixel block and to generate a corresponding lookup table. In one embodiment, the pixel block is a single pixel, and the polynomial fitting function is Zg = a0*Zt^2 + a1*Zt + a2, where Zg is the actual depth value, Zt is the measured depth value, the fitting parameters of the pixel block are [a0, a1, a2], and the fitting parameters of each pixel block in the preset number of pixel blocks differ from one another.
In one embodiment, the system 20 further comprises a look-up unit 206, a third calculation unit 207, and a fourth calculation unit 208.
The searching unit 206 is configured to search the lookup table according to the pixel block to obtain a corresponding fitting parameter, so as to generate a corresponding polynomial fitting function;
the third calculating unit 207 is configured to calculate an actual depth value according to the measured depth value of the input pixel block and the corresponding polynomial fitting function;
the fourth calculation unit 208 is configured to calculate an average value of the actual depth values of the preset number of pixel blocks as the actual depth value of the depth image.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer readable storage medium. Based on such understanding, all or part of the flow of the method according to the embodiments of the present invention may also be implemented by a computer program, which may be stored in a computer-readable storage medium, and when the computer program is executed by a server, the steps of the method embodiments may be implemented.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present invention, and are intended to be included within the scope of the present invention.

Claims (10)

1. A multi-distance calibration method based on a depth camera is characterized by comprising the following steps:
acquiring a plurality of depth images at a plurality of preset distances by a depth camera;
calculating each measured depth value of a preset number of pixel blocks in each depth image;
constructing a polynomial fitting function and fitting the measured depth value and the actual depth value of each pixel block according to the polynomial fitting function;
respectively calculating the value of the undetermined coefficient in the polynomial fitting function of each pixel block;
and taking the value of the undetermined coefficient as a fitting parameter of each pixel block to be stored and generating a corresponding lookup table.
2. The multi-distance calibration method according to claim 1, comprising:
searching the lookup table according to the pixel blocks to obtain corresponding fitting parameters so as to generate corresponding polynomial fitting functions;
and calculating the actual depth value according to the input measured depth value of the pixel block and the corresponding polynomial fitting function.
3. The multi-distance calibration method according to claim 2, comprising:
and calculating the average value of the actual depth values of the preset number of pixel blocks and taking the average value as the actual depth value of the depth image.
4. The multi-distance calibration method according to claim 1, wherein the pixel block is a single pixel.
5. The multi-distance calibration method according to any one of claims 1 to 4, wherein the polynomial fitting function is
Zg = a0*Zt^2 + a1*Zt + a2

where Zg is the actual depth value, Zt is the measured depth value, the fitting parameters of the pixel block are [a0, a1, a2], and the fitting parameters of the pixel blocks in the preset number of pixel blocks differ from one another.
6. A depth camera based multi-distance calibration system, comprising:
the acquisition unit is used for acquiring a plurality of depth images at a plurality of preset distances;
a first calculation unit for calculating each measured depth value of a preset number of pixel blocks in each depth image;
the fitting unit is used for constructing a polynomial fitting function and fitting each measured depth value with an actual depth value according to the polynomial fitting function;
the second calculation unit is used for respectively calculating the values of undetermined coefficients in the polynomial function of the pixel block;
and the storage unit is used for storing the value of the undetermined coefficient as a fitting parameter of each pixel block and generating a corresponding lookup table.
7. The multi-distance calibration system according to claim 6, further comprising:
the searching unit is used for searching the lookup table according to the pixel blocks to obtain corresponding fitting parameters so as to generate corresponding polynomial fitting functions;
a third calculation unit for calculating actual depth values from the measured depth values of the input pixel blocks and from the corresponding polynomial fitting function.
8. The multi-distance calibration system according to claim 7, further comprising:
and the fourth calculation unit is used for calculating the average value of the actual depth values of the preset number of pixel blocks and taking the average value as the actual depth value of the depth image.
9. The multi-distance calibration system of claim 8 wherein said block of pixels is a single pixel.
10. The multi-distance calibration system according to any one of claims 6 to 9, wherein said polynomial fitting function is:
Zg = a0*Zt^2 + a1*Zt + a2

where Zg is the actual depth value, Zt is the measured depth value, the fitting parameters of the pixel block are [a0, a1, a2], and the fitting parameters of the pixel blocks in the preset number of pixel blocks differ from one another.
CN201910997191.6A 2019-10-20 2019-10-20 Multi-distance calibration method and system based on depth camera Active CN111028294B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201910997191.6A CN111028294B (en) 2019-10-20 2019-10-20 Multi-distance calibration method and system based on depth camera
PCT/CN2020/090926 WO2021077731A1 (en) 2019-10-20 2020-05-18 Multi-distance calibration method and system based on depth camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910997191.6A CN111028294B (en) 2019-10-20 2019-10-20 Multi-distance calibration method and system based on depth camera

Publications (2)

Publication Number Publication Date
CN111028294A (en) 2020-04-17
CN111028294B (en) 2024-01-16

Family

ID=70201037

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910997191.6A Active CN111028294B (en) 2019-10-20 2019-10-20 Multi-distance calibration method and system based on depth camera

Country Status (2)

Country Link
CN (1) CN111028294B (en)
WO (1) WO2021077731A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112270712A (en) * 2020-09-08 2021-01-26 奥比中光科技集团股份有限公司 Temperature drift calibration method and system based on depth camera module
WO2021077731A1 (en) * 2019-10-20 2021-04-29 深圳奥比中光科技有限公司 Multi-distance calibration method and system based on depth camera
CN113295089A (en) * 2021-04-07 2021-08-24 深圳市异方科技有限公司 Compartment volume rate measuring method based on visual inertia SLAM
CN113295089B (en) * 2021-04-07 2024-04-26 深圳市异方科技有限公司 Carriage volume rate measuring method based on visual inertia SLAM

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150279016A1 (en) * 2014-03-27 2015-10-01 Electronics And Telecommunications Research Institute Image processing method and apparatus for calibrating depth of depth sensor
CN105160680A (en) * 2015-09-08 2015-12-16 北京航空航天大学 Design method of camera with no interference depth based on structured light
CN107545586A (en) * 2017-08-04 2018-01-05 中国科学院自动化研究所 Based on the local depth acquisition methods of light field limit plane picture and system
CN108648239A (en) * 2018-05-04 2018-10-12 苏州富强科技有限公司 The scaling method of phase height mapping system based on segmented fitting of a polynomial
CN109272556A (en) * 2018-08-31 2019-01-25 青岛小鸟看看科技有限公司 A kind of scaling method and device of flight time TOF camera
CN109741384A (en) * 2018-12-18 2019-05-10 深圳奥比中光科技有限公司 The more distance detection devices and method of depth camera
CN109767476A (en) * 2019-01-08 2019-05-17 像工场(深圳)科技有限公司 A kind of calibration of auto-focusing binocular camera and depth computing method
CN110232715A (en) * 2019-05-08 2019-09-13 深圳奥比中光科技有限公司 A kind of self-alignment method, apparatus of more depth cameras and system

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104331896B (en) * 2014-11-21 2017-09-08 天津工业大学 A kind of system calibrating method based on depth information
US10771776B2 (en) * 2017-09-12 2020-09-08 Sony Corporation Apparatus and method for generating a camera model for an imaging system
CN109816735B (en) * 2019-01-24 2022-10-21 哈工大机器人(合肥)国际创新研究院 Rapid calibration and correction method and TOF camera thereof
CN111028294B (en) * 2019-10-20 2024-01-16 奥比中光科技集团股份有限公司 Multi-distance calibration method and system based on depth camera

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150279016A1 (en) * 2014-03-27 2015-10-01 Electronics And Telecommunications Research Institute Image processing method and apparatus for calibrating depth of depth sensor
CN105160680A (en) * 2015-09-08 2015-12-16 北京航空航天大学 Design method of camera with no interference depth based on structured light
CN107545586A (en) * 2017-08-04 2018-01-05 中国科学院自动化研究所 Based on the local depth acquisition methods of light field limit plane picture and system
CN108648239A (en) * 2018-05-04 2018-10-12 苏州富强科技有限公司 The scaling method of phase height mapping system based on segmented fitting of a polynomial
CN109272556A (en) * 2018-08-31 2019-01-25 青岛小鸟看看科技有限公司 A kind of scaling method and device of flight time TOF camera
CN109741384A (en) * 2018-12-18 2019-05-10 深圳奥比中光科技有限公司 The more distance detection devices and method of depth camera
CN109767476A (en) * 2019-01-08 2019-05-17 像工场(深圳)科技有限公司 A kind of calibration of auto-focusing binocular camera and depth computing method
CN110232715A (en) * 2019-05-08 2019-09-13 深圳奥比中光科技有限公司 A kind of self-alignment method, apparatus of more depth cameras and system

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021077731A1 (en) * 2019-10-20 2021-04-29 深圳奥比中光科技有限公司 Multi-distance calibration method and system based on depth camera
CN112270712A (en) * 2020-09-08 2021-01-26 奥比中光科技集团股份有限公司 Temperature drift calibration method and system based on depth camera module
CN113295089A (en) * 2021-04-07 2021-08-24 深圳市异方科技有限公司 Compartment volume rate measuring method based on visual inertia SLAM
CN113295089B (en) * 2021-04-07 2024-04-26 深圳市异方科技有限公司 Carriage volume rate measuring method based on visual inertia SLAM

Also Published As

Publication number Publication date
CN111028294B (en) 2024-01-16
WO2021077731A1 (en) 2021-04-29

Similar Documents

Publication Publication Date Title
US9858684B2 (en) Image processing method and apparatus for calibrating depth of depth sensor
US11335020B2 (en) Method and system for correcting temperature error of depth camera
JP5285619B2 (en) Camera system calibration
CN107657635B (en) Depth camera temperature error correction method and system
US8306323B2 (en) Method and apparatus for correcting depth image
US8265343B2 (en) Apparatus, method and program for distance measurement
CN106875435B (en) Method and system for obtaining depth image
JP6041513B2 (en) Image processing apparatus, image processing method, and program
CN107860337B (en) Structured light three-dimensional reconstruction method and device based on array camera
KR20110052993A (en) Method and apparatus for compensating image
US11403745B2 (en) Method, apparatus and measurement device for measuring distortion parameters of a display device, and computer-readable medium
CN110349257B (en) Phase pseudo mapping-based binocular measurement missing point cloud interpolation method
CN111028294B (en) Multi-distance calibration method and system based on depth camera
US10499038B2 (en) Method and system for recalibrating sensing devices without familiar targets
CN114111633A (en) Projector lens distortion error correction method for structured light three-dimensional measurement
CN110542540A (en) optical axis alignment correction method of structured light module
CN108921902B (en) Method and device for correcting structured light camera deviation
KR102185329B1 (en) Distortion correction method of 3-d coordinate data using distortion correction device and system therefor
CN115018922A (en) Distortion parameter calibration method, electronic device and computer readable storage medium
CN110232715B (en) Method, device and system for self calibration of multi-depth camera
Fu et al. Calibration of Projector with Fixed Pattern and Large Distortion Lens in a Structured Light System.
CN112305524A (en) Ranging method, ranging system, and computer-readable storage medium
CN112115930A (en) Method and device for determining pose information
CN116818129B (en) Temperature estimation and thermal distortion correction method applied to structured light reconstruction
KR0139572B1 (en) Image distortion calibration method using nervous network

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 11-13 / F, joint headquarters building, high tech Zone, 63 Xuefu Road, Yuehai street, Nanshan District, Shenzhen, Guangdong 518000

Applicant after: Obi Zhongguang Technology Group Co.,Ltd.

Address before: 11-13 / F, joint headquarters building, high tech Zone, 63 Xuefu Road, Yuehai street, Nanshan District, Shenzhen, Guangdong 518000

Applicant before: SHENZHEN ORBBEC Co.,Ltd.

GR01 Patent grant