CN105306922A - Method and device for obtaining depth camera reference diagram - Google Patents


Publication number
CN105306922A
Authority
CN
China
Prior art keywords
spot
reference diagram
white point
pattern
optical element
Prior art date
Legal status: Granted
Application number
CN201410334261.7A
Other languages
Chinese (zh)
Other versions
CN105306922B (en
Inventor
王琳
Current Assignee
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201410334261.7A priority Critical patent/CN105306922B/en
Publication of CN105306922A publication Critical patent/CN105306922A/en
Application granted granted Critical
Publication of CN105306922B publication Critical patent/CN105306922B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Measurement Of Optical Distance (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a method for obtaining a depth camera reference diagram, belonging to the technical field of image processing. The method comprises the following steps: forming a speckle pattern on a reference plane by means of a laser emitting device and a diffractive optical element; photographing the speckle pattern with a photographing device to obtain a first reference diagram; determining a mapping relationship between the spots in the first reference diagram and the white points in the pattern of the diffractive optical element; and synthesizing the depth camera reference diagram by simulation according to the mapping relationship and the pattern of the diffractive optical element. Because the reference diagram is synthesized by simulation rather than used directly as photographed, the noise points caused by secondary scattering of the laser are effectively avoided; furthermore, no additional external equipment is needed, which reduces equipment cost. The invention further discloses a device for obtaining the depth camera reference diagram.

Description

Method and device for obtaining a depth camera reference diagram
Technical field
The present invention relates to the technical field of image processing, and in particular to a method and device for obtaining a depth camera reference diagram.
Background technology
Image processing technology processes images to satisfy human visual perception or the demands of other applications. With the development of science and technology and the ever-growing needs of users, the range of application of image processing technology is becoming increasingly extensive.
Depth cameras, which make three-dimensional ranging possible, have gradually come to prominence. One of the key technologies of existing depth cameras is obtaining the reference diagram: according to the position of each speckle in the reference diagram, the spatial coordinates of every other point in real space are calculated. In the current method, a laser projector emits tens of thousands of laser beams through a diffractive optical element module onto a flat surface, forming a set of speckle points. The layout of the speckle pattern is determined by the etched pattern of the diffractive optical element. The camera inside the depth camera photographs the speckle points falling on the flat surface, and the picture is stored in the depth camera as the reference diagram. This reference diagram is then used to calculate the depth map of the actual scene.
However, this method suffers from laser secondary-scattering noise: any physical flat surface necessarily has a certain roughness, so the camera also collects light scattered by the surface; this scattering is called "secondary-scattering speckle". At present, this problem can be suppressed by moving the camera horizontally in its plane, but doing so requires adding a guide rail and an additional photographing device, which increases equipment cost.
Summary of the invention
To overcome the above problems, the invention provides a method and device for obtaining a depth camera reference diagram. The technical scheme is as follows:
In one aspect, the invention provides a method for obtaining a depth camera reference diagram. The depth camera comprises a laser emitting device, a diffractive optical element and a photographing device; the laser emitting device emits laser light and projects it onto a reference plane through the diffractive optical element; the photographing device photographs the speckle pattern that the laser forms on the reference plane; the pattern of the diffractive optical element comprises multiple white points, and the laser passing through each white point forms one spot on the reference plane. The method comprises:
forming the speckle pattern on the reference plane by means of the laser emitting device and the diffractive optical element;
photographing the speckle pattern with the photographing device to obtain a first reference diagram;
determining the mapping relationship between the spots in the first reference diagram and the white points in the pattern of the diffractive optical element;
synthesizing the depth camera reference diagram by simulation according to the mapping relationship and the pattern of the diffractive optical element.
Further, determining the mapping relationship between the spots in the first reference diagram and the white points in the pattern of the diffractive optical element comprises:
determining the spot in the first reference diagram that corresponds to an appointed white point in the pattern of the diffractive optical element;
determining, in the same coordinate system, the coordinates of the appointed white point and of the spot corresponding to it;
determining, according to those coordinates, the mapping relationship between the spots in the first reference diagram and the white points.
In one implementable manner, the shape of the appointed white point is different from the shapes of the other white points.
Further, synthesizing the depth camera reference diagram by simulation according to the mapping relationship and the pattern of the diffractive optical element comprises:
determining the positions of all the spots on the reference plane according to the coordinates of all the white points in the pattern of the diffractive optical element and the mapping relationship;
simulating the brightness of each spot on the reference plane by two-dimensional Gaussian convolution and forming a simulated spot at the position of each spot, thereby obtaining the depth camera reference diagram.
Further, determining the positions of all the spots on the reference plane according to the coordinates of all the white points in the pattern of the diffractive optical element and the mapping relationship comprises:
determining the coordinates of all the spots on the reference plane according to the coordinates of all the white points and the mapping relationship;
searching, within a specified range around the coordinates of each spot, for the position of the brightness maximum, and taking it as the position of that spot.
Further, the mapping relationship comprises a linear mapping relationship or a nonlinear mapping relationship.
In another aspect, the invention provides a device for obtaining a depth camera reference diagram. The depth camera comprises a laser emitting device, a diffractive optical element and a photographing device; the laser emitting device emits laser light and projects it onto a reference plane through the diffractive optical element; the photographing device photographs the speckle pattern that the laser forms on the reference plane; the pattern of the diffractive optical element comprises multiple white points, and the laser passing through each white point forms one spot on the reference plane. The device comprises:
a first reference diagram acquisition module, configured to obtain the first reference diagram produced by photographing the speckle pattern formed on the reference plane;
a determination module, configured to determine the mapping relationship between the spots in the first reference diagram and the white points in the pattern of the diffractive optical element;
a depth camera reference diagram acquisition module, configured to synthesize the depth camera reference diagram by simulation according to the mapping relationship and the pattern of the diffractive optical element.
Further, the determination module comprises:
a first sub-determination module, configured to determine the spot in the first reference diagram that corresponds to an appointed white point in the pattern of the diffractive optical element;
a second sub-determination module, configured to determine, in the same coordinate system, the coordinates of the appointed white point and of the spot corresponding to it;
a third sub-determination module, configured to determine, according to those coordinates, the mapping relationship between the spots in the first reference diagram and the white points.
In one implementable manner, the shape of the appointed white point is different from the shapes of the other white points.
Further, the depth camera reference diagram acquisition module comprises:
a spot position acquisition module, configured to determine the positions of all the spots on the reference plane according to the coordinates of all the white points in the pattern of the diffractive optical element and the mapping relationship;
a spot simulation module, configured to simulate the brightness of each spot on the reference plane by two-dimensional Gaussian convolution and form a simulated spot at the position of each spot.
Further, the spot position acquisition module comprises:
a spot coordinate acquisition module, configured to determine the coordinates of all the spots on the reference plane according to the coordinates of all the white points and the mapping relationship;
a spot position search module, configured to search, within a specified range around the coordinates of each spot, for the position of the brightness maximum and take it as the position of that spot.
Further, the mapping relationship comprises a linear mapping relationship or a nonlinear mapping relationship.
In the invention, the speckle pattern photographed by the photographing device serves as the first reference diagram; the mapping relationship between the spots in the first reference diagram and the white points in the pattern of the diffractive optical element is obtained, and the depth camera reference diagram is synthesized by simulation according to this mapping relationship. The laser secondary-scattering noise problem is thereby effectively avoided, and since no additional external equipment is required, equipment cost is reduced.
Brief description of the drawings
Fig. 1a is a schematic structural diagram of the depth camera provided by an embodiment of the present invention;
Fig. 1b is a working principle diagram of the depth camera provided by an embodiment of the present invention;
Fig. 2 is a flow chart of the method for obtaining a depth camera reference diagram provided in embodiment one of the present invention;
Fig. 3 is a flow chart of another method for obtaining a depth camera reference diagram provided in embodiment two of the present invention;
Fig. 4 is the pattern of the diffractive optical element in embodiment two of the present invention;
Fig. 5 is the first reference diagram obtained by photographing the speckle pattern in embodiment two of the present invention;
Fig. 6 is a diagram of a convolution calculation in embodiment two of the present invention;
Figs. 6a, 6b and 6c illustrate simulating a light spot by convolution in embodiment two of the present invention;
Fig. 7 is the pattern of a local block before being processed by the method of embodiment two of the present invention;
Fig. 8 is the pattern of the local block of Fig. 7 after being processed by the method;
Fig. 9 is a module diagram of the device for obtaining a depth camera reference diagram provided in embodiment three of the present invention;
Fig. 10 is a module diagram of the device for obtaining a depth camera reference diagram provided in embodiment four of the present invention.
Detailed description of embodiments
To make the objects, technical solutions and advantages of the present invention clearer, embodiments of the present invention are described in further detail below in conjunction with the accompanying drawings.
To facilitate understanding of the embodiments, the structure and working principle of the depth camera are first briefly introduced with reference to Fig. 1a. As shown in Fig. 1a, the depth camera comprises a laser emitting device 11, a diffractive optical element (Diffractive Optical Element, DOE) 12 and a photographing device 13. The laser emitting device 11 emits laser light and projects it onto a reference plane 14 through the diffractive optical element 12. The reference plane 14 is a smooth plane. The photographing device 13 photographs the speckle pattern that the laser forms on the reference plane 14. The pattern of the diffractive optical element 12 comprises multiple white points, and the laser passing through each white point forms one spot on the reference plane 14 (the diffractive optical element 12 is equivalent to a glass sheet covering the laser emitting device 11, with multiple white points provided on the glass sheet).
The spots on the reference plane 14 and the white points on the diffractive optical element 12 can be regarded as being in one-to-one correspondence. Each laser beam passing through a white point is collimated, so it neither distorts nor disperses, and finally falls on the reference plane 14 to form one spot. The set of all the spots forms the speckle pattern. It follows that the layout of the speckle pattern is determined by the pattern of the diffractive optical element 12.
After the laser emitted by the laser emitting device 11 passes through the diffractive optical element 12, tens of thousands of laser beams are formed and projected onto the reference plane 14, forming the speckle pattern. The photographing device 13 photographs this speckle pattern, which is stored as the reference diagram of the depth camera and used as the reference pattern for calculating the depth map of the actual scene in real time.
The depth camera calculates the depth map of the actual scene by triangulation. Referring to Fig. 1b, a laser beam (taking a single beam as an example) emitted by the laser emitting device 11 (for example, a projector) strikes different planes at distances Z1 and Z2, where Z1 is the reference plane. The corresponding light spot imaged on the image sensor (Charge-Coupled Device, CCD) of the photographing device 13 undergoes a horizontal displacement from Xc1 to Xc2. According to the principle of triangulation, when the actual distance of the reference plane Z1 is known (i.e., the constant Z1), the actual distance of any plane Zk can be inferred by detecting the displacement between the two spot images that the same beam forms on the CCD for the reference plane Z1 and for the plane Zk.
That is, the speckle pattern of the reference plane Z1 must be stored in the depth camera, together with the known actual distance of the reference plane Z1. When an image of a plane Zk at any distance is input, ranging can then be realized by detecting the spot displacement on the CCD of the photographing device 13.
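The triangulation relation above can be sketched as a short calculation. This is an illustrative model only: the pinhole-model formula, the focal length, the baseline and the distances below are hypothetical assumptions, not parameters given in the patent.

```python
def depth_from_disparity(z_ref, disparity_px, focal_px, baseline_m):
    """Infer the distance Z_k of an object plane from the pixel displacement
    of its spot image relative to the stored reference plane at distance z_ref.

    Standard structured-light triangulation under a pinhole model:
        d = f * b * (1/Z_k - 1/Z_ref)   =>   Z_k = 1 / (d/(f*b) + 1/Z_ref)
    where f is the focal length in pixels and b the projector-camera baseline.
    """
    return 1.0 / (disparity_px / (focal_px * baseline_m) + 1.0 / z_ref)

# With the assumed values f = 580 px, b = 0.075 m and Z_ref = 2.0 m,
# a disparity of 21.75 px maps back to a plane at Z_k = 1.0 m,
# and zero disparity recovers the reference distance itself.
```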
Embodiment one
An embodiment of the present invention provides a method for obtaining a depth camera reference diagram. The method is applied to the depth camera shown in Fig. 1a.
Referring to Fig. 2, the method comprises:
Step 101: forming a speckle pattern on the reference plane by means of the laser emitting device and the diffractive optical element.
Step 102: photographing the speckle pattern with the photographing device to obtain a first reference diagram.
Step 103: determining the mapping relationship between the spots in the first reference diagram and the white points in the pattern of the diffractive optical element.
Step 104: synthesizing the depth camera reference diagram by simulation according to the mapping relationship and the pattern of the diffractive optical element.
In this embodiment of the invention, the speckle pattern photographed by the photographing device serves as the first reference diagram; the mapping relationship between its spots and the white points of the diffractive optical element's pattern is obtained, and the depth camera reference diagram is synthesized by simulation according to this mapping relationship. The laser secondary-scattering noise problem is thereby effectively avoided, and since no additional external equipment is required, equipment cost is reduced.
Embodiment two
An embodiment of the present invention provides a method for obtaining a depth camera reference diagram, applied to the depth camera shown in Fig. 1a. Referring to Fig. 3, the method comprises:
Step 201: forming a speckle pattern on the reference plane by means of the laser emitting device and the diffractive optical element.
Fig. 4 shows the pattern of the diffractive optical element of this embodiment. As can be seen from Fig. 4, the white points are uniform in area and regularly arranged.
Step 202: photographing the speckle pattern with the photographing device to obtain a first reference diagram.
Fig. 5 is the first reference diagram obtained by photographing the speckle pattern in this embodiment. The spots in the first reference diagram can be considered to be in one-to-one correspondence with the white points of the pattern of the diffractive optical element, with a mapping relationship between them. As can be seen from Fig. 5, because of laser secondary-scattering noise, the speckle points in the first reference diagram are relatively blurred.
In this embodiment, the photographing device photographs part of the speckle pattern to obtain the first reference diagram. Evidently, under the condition that the photographing device can capture the whole speckle pattern, the whole captured speckle pattern may also serve as the first reference diagram; the subsequent steps then process the acquired first reference diagram in the same way.
Step 203: determining the spot in the first reference diagram that corresponds to an appointed white point in the pattern of the diffractive optical element.
In a specific implementation, the shape of the appointed white point can be set to differ from the shapes of the other white points, so as to determine which white point in the pattern of the diffractive optical element a given spot in the first reference diagram corresponds to. For example, in Fig. 4 the shape of a certain appointed white point can be set to a cross; the spot corresponding to this appointed white point is then obtained by visual observation of the first reference diagram of Fig. 5.
Alternatively, the correspondence between spots in the first reference diagram and white points in the pattern of the diffractive optical element can be determined by computing the similarity between a first local block of the pattern of the diffractive optical element and a second local block of the first reference diagram. There are multiple ways to measure similarity. A common one is to compute the sum of the absolute values of the point-to-point differences between the first local block and the second local block, an algorithm called Sum of Absolute Differences (SAD); the smaller the SAD value, the higher the similarity of the two blocks. A more robust method, such as ZNCC (zero-mean normalized cross-correlation), can also be used to measure similarity.
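The SAD block comparison described above can be sketched as follows. The function names, the search radius and the block sizes are illustrative assumptions, not values specified in the patent.

```python
import numpy as np

def sad(block_a, block_b):
    """Sum of Absolute Differences: lower means more similar."""
    a = np.asarray(block_a, dtype=float)
    b = np.asarray(block_b, dtype=float)
    return np.abs(a - b).sum()

def best_match(template, image, ty, tx, search=5):
    """Find the offset within +/-search pixels of (ty, tx) at which a patch
    of `image` minimises the SAD against `template`."""
    h, w = template.shape
    best, best_score = None, None
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = ty + dy, tx + dx
            if y < 0 or x < 0 or y + h > image.shape[0] or x + w > image.shape[1]:
                continue  # patch would fall outside the image
            score = sad(template, image[y:y + h, x:x + w])
            if best_score is None or score < best_score:
                best_score, best = score, (dy, dx)
    return best, best_score
```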
Step 204: determining, in the same coordinate system, the coordinates of the appointed white point and of the spot corresponding to it.
Step 205: determining, according to the coordinates of the appointed white point and of its corresponding spot, the mapping relationship between the spots in the first reference diagram and the white points. This mapping relationship comprises a linear mapping relationship or a nonlinear mapping relationship.
In a specific implementation, let the coordinates of an appointed white point be (x, y) and the coordinates of its corresponding spot be (xp, yp). The white point coordinates (x, y) and the spot coordinates (xp, yp) satisfy the projection relationship of the general projective theorem.
This projection relationship can be expressed as k·[xp, yp, 1]^T = H·[x, y, 1]^T, where k > 0, [xp, yp, 1]^T and [x, y, 1]^T are three-row, one-column matrices, and H is a 3 × 3 homography matrix; H can be solved from the system of linear equations formed by 4 pairs of corresponding points as above. It follows that, in the ideal case, the spot coordinates (xp, yp) and the white point coordinates (x, y) are related by a linear mapping.
Beyond this linear mapping, deformation of the laser emitting device causes the actually projected speckle pattern to exhibit radial and tangential distortion, so that the actual spot coordinates (xp, yp) and the white point coordinates (x, y) are related by a nonlinear mapping. This nonlinear mapping can be handled by introducing nonlinear parameters into the optimization.
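As a sketch of the linear case, the homography H can be recovered from four or more white-point/spot correspondences by the standard direct linear transform (DLT); the point values used below are made up for illustration.

```python
import numpy as np

def estimate_homography(white_pts, spot_pts):
    """Estimate the 3x3 homography H with k*[xp, yp, 1]^T = H*[x, y, 1]^T
    from >= 4 (white point -> spot) pairs via the direct linear transform:
    each pair contributes two rows of a homogeneous system A h = 0,
    solved as the null vector of A from its SVD."""
    A = []
    for (x, y), (xp, yp) in zip(white_pts, spot_pts):
        A.append([-x, -y, -1, 0, 0, 0, x * xp, y * xp, xp])
        A.append([0, 0, 0, -x, -y, -1, x * yp, y * yp, yp])
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]  # fix the overall scale

def map_point(H, x, y):
    """Apply H to (x, y) and dehomogenize."""
    v = H @ np.array([x, y, 1.0])
    return v[0] / v[2], v[1] / v[2]
```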
Steps 203 to 205 above realize: determining the mapping relationship between the spots in the first reference diagram and the white points in the pattern of the diffractive optical element.
Step 206: determining the positions of all the spots on the reference plane according to the coordinates of all the white points in the pattern of the diffractive optical element and the mapping relationship.
In a specific implementation, the coordinates of all the spots on the reference plane are first determined according to the coordinates of all the white points in the pattern of the diffractive optical element and the mapping relationship. Then, within a specified range around the coordinates of each spot, the position of the brightness maximum is searched for and taken as the position of that spot. The positions of all the spots on the reference plane are thereby determined. The search may also be performed by local blocks, which is not detailed here.
It is worth explaining that, in the actually photographed first reference diagram, the region blocks (or pixel blocks) immediately adjacent to the position of the brightness maximum also have a certain brightness value, merely slightly lower. In this embodiment, the brightness value of the brightness maximum is set to 1 (bright) and the brightness values of the slightly darker region blocks are set to 0 (dark). By taking the position of the brightness maximum as the center of the spot, the spot can be located more accurately, improving the accuracy of the simulated reference diagram.
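A minimal sketch of the brightness-maximum search, assuming a grayscale image array; the window radius is a hypothetical choice, not a value from the patent.

```python
import numpy as np

def refine_spot_position(image, y0, x0, radius=3):
    """Within a (2*radius+1)-square window around the predicted coordinate
    (y0, x0), take the position of the brightness maximum as the spot center."""
    h, w = image.shape
    y1, y2 = max(0, y0 - radius), min(h, y0 + radius + 1)
    x1, x2 = max(0, x0 - radius), min(w, x0 + radius + 1)
    window = image[y1:y2, x1:x2]
    dy, dx = np.unravel_index(np.argmax(window), window.shape)
    return int(y1 + dy), int(x1 + dx)
```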
Understandably, in other embodiments step 206 can also be realized as follows: determine the coordinates of all the spots on the reference plane according to the coordinates of all the white points in the pattern of the diffractive optical element and the aforementioned mapping relationship, and directly take the determined coordinates as the position of each spot.
Step 207: simulating the brightness of each spot on the reference plane by two-dimensional Gaussian convolution, forming a simulated spot at the position of each spot, and obtaining the depth camera reference diagram.
In this embodiment, the brightness of a spot on the actual reference plane is assumed to follow a Gaussian distribution, so a spot can be synthesized by simulation at the single-pixel position of each spot.
Fig. 6 illustrates the calculation procedure for a one-dimensional convolution: part (a) is the convolution kernel, which can be regarded as the Gaussian kernel to be used; the impulse functions in part (b) can be regarded as the positions of the spots; part (c) is the result after convolution, showing that each spot position takes on the shape of the convolution kernel.
The following is an example of simulating light spots by convolution. Fig. 6a is a binary map characterizing the spot positions, with a brightness value of 255 at each spot position and 0 elsewhere. Fig. 6b is the Gaussian convolution kernel used, shown as an image in the upper right corner. Fig. 6c is the simulated speckle pattern resulting from the convolution.
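The convolution-based synthesis can be sketched as follows. The kernel size and sigma are illustrative assumptions; since the spot map is binary and spots are assumed not to overlap within a kernel footprint, pasting the kernel at each spot position (taking the maximum where footprints meet) is equivalent to the 2-D Gaussian convolution described above, with the peak brightness normalized to 1.

```python
import numpy as np

def gaussian_kernel(size=7, sigma=1.5):
    """2-D Gaussian kernel with its peak normalized to 1."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return k / k.max()

def synthesize_reference(shape, spot_positions, size=7, sigma=1.5):
    """Turn each (y, x) spot position into a Gaussian-shaped simulated
    speckle by stamping the kernel onto an initially dark image."""
    out = np.zeros(shape)
    kern = gaussian_kernel(size, sigma)
    r = size // 2
    for (y, x) in spot_positions:
        # clip the stamp region at the image borders
        y1, x1 = max(0, y - r), max(0, x - r)
        y2, x2 = min(shape[0], y + r + 1), min(shape[1], x + r + 1)
        ky1, kx1 = y1 - (y - r), x1 - (x - r)
        out[y1:y2, x1:x2] = np.maximum(
            out[y1:y2, x1:x2],
            kern[ky1:ky1 + (y2 - y1), kx1:kx1 + (x2 - x1)])
    return out
```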
Fig. 7 and Fig. 8 compare a certain local block before and after processing by the method of this embodiment for obtaining a depth camera reference diagram. It can be seen from the figures that the speckle pattern formed by the simulation of this embodiment is markedly clearer and can serve as the reference diagram in the depth camera, so that the depth of the actual scene is calculated effectively.
Steps 206 and 207 above realize: synthesizing the depth camera reference diagram by simulation according to the mapping relationship and the pattern of the diffractive optical element.
In this embodiment of the invention, the speckle pattern photographed by the photographing device serves as the first reference diagram; the mapping relationship between its spots and the white points of the diffractive optical element's pattern is obtained, and the depth camera reference diagram is synthesized by simulation according to this mapping relationship. The laser secondary-scattering noise problem is thereby effectively avoided, and no additional external equipment is required, reducing equipment cost. Furthermore, the problem of the small shooting angle of the camera inside the depth camera is also solved, and the entire depth camera reference diagram can be obtained in one pass. In addition, because this depth camera reference diagram occupies less storage, it can be obtained and stored more quickly, improving the user experience.
Embodiment three
Embodiments provide a kind of depth camera reference diagram acquisition device, this device can be applied to depth camera as shown in Figure 1a.See Fig. 9, device comprises the first reference diagram acquisition module 301, determination module 302 and depth camera reference diagram acquisition module 303.
The first reference diagram acquisition module 301 is configured to obtain the first reference diagram produced by photographing the speckle pattern formed on the reference plane;
the determination module 302 is configured to determine the mapping relationship between the spots in the first reference diagram and the white points in the pattern of the diffractive optical element;
the depth camera reference diagram acquisition module 303 is configured to synthesize the depth camera reference diagram by simulation according to the mapping relationship and the pattern of the diffractive optical element.
In this embodiment of the invention, the speckle pattern photographed by the photographing device serves as the first reference diagram; the mapping relationship between its spots and the white points of the diffractive optical element's pattern is obtained, and the depth camera reference diagram is synthesized by simulation according to this mapping relationship. The laser secondary-scattering noise problem is thereby effectively avoided, and since no additional external equipment is required, equipment cost is reduced.
Embodiment four
Embodiments provide a kind of depth camera reference diagram acquisition device, this device can be applied to depth camera as shown in Figure 1a.See Figure 10, device comprises the first reference diagram acquisition module 401, determination module 402 and depth camera reference diagram acquisition module 403.
The first reference diagram acquisition module 401 is configured to obtain the first reference diagram produced by photographing the speckle pattern formed on the reference plane.
In this embodiment, the photographing device photographs part of the speckle pattern to obtain the first reference diagram. Evidently, under the condition that the photographing device can capture the whole speckle pattern, the whole captured speckle pattern may also serve as the first reference diagram.
The determination module 402 is configured to determine the mapping relationship between the spots in the first reference diagram and the white points in the pattern of the diffractive optical element.
In one implementation of the invention, the determination module comprises a first sub-determination module 4021, a second sub-determination module 4022 and a third sub-determination module 4023.
The first sub-determination module 4021 is configured to determine the spot in the first reference diagram that corresponds to an appointed white point in the pattern of the diffractive optical element.
In a specific implementation, the shape of the appointed white point can be set to differ from the shapes of the other white points, so as to determine which white point in the pattern of the diffractive optical element a given spot in the first reference diagram corresponds to. For example, the shape of a certain appointed white point can be set to a cross; the spot corresponding to this appointed white point is then obtained by visual observation of the first reference diagram.
Alternatively, the correspondence between spots in the first reference diagram and white points in the pattern of the diffractive optical element can be determined by computing the similarity between a first local block of the pattern of the diffractive optical element and a second local block of the first reference diagram. There are multiple ways to measure similarity. A common one is to compute the sum of the absolute values of the point-to-point differences between the two blocks, called the SAD algorithm; the smaller the SAD value, the higher the similarity. A more robust method, such as ZNCC, can also be used to measure similarity.
The second sub-determination module 4022 is configured to determine, in the same coordinate system, the coordinates of the designated white point and of the spot corresponding to the designated white point.
The third sub-determination module 4023 is configured to determine the mapping relationship between the spots in the first reference diagram and the white points according to the coordinates of the designated white point and of its corresponding spot.
The mapping relationship may be a linear mapping relationship or a nonlinear mapping relationship. For a detailed explanation of the mapping relationship between the spots in the first reference diagram and the white points, see Embodiment 2; it is not repeated here.
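Under the linear-mapping assumption mentioned above, one way (an illustrative sketch, not the patent's prescribed method; names are hypothetical) is to fit an affine map `spot = white @ M + t` by least squares from a few designated white-point/spot correspondences, then apply it to every white point:

```python
import numpy as np

def fit_linear_mapping(white_points, spots):
    """Fit an affine map  spot = white @ M + t  from matched
    (white point, spot) coordinate pairs by least squares.
    white_points, spots: arrays of shape (N, 2), N >= 3."""
    w = np.asarray(white_points, dtype=float)
    s = np.asarray(spots, dtype=float)
    # Homogeneous design matrix [x, y, 1], so the translation t is fitted too.
    design = np.hstack([w, np.ones((len(w), 1))])
    params, *_ = np.linalg.lstsq(design, s, rcond=None)  # shape (3, 2)
    return params

def apply_mapping(params, white_points):
    """Map white-point coordinates to predicted spot coordinates."""
    w = np.asarray(white_points, dtype=float)
    return np.hstack([w, np.ones((len(w), 1))]) @ params
```

With three or more non-collinear designated white points this recovers the affine map exactly when the true relationship is linear; a nonlinear mapping would need a higher-order model fitted the same way.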
The depth camera reference diagram acquisition module 403 is configured to synthesize the depth camera reference diagram by simulation, according to the mapping relationship and the pattern of the diffractive optical element.
In one implementation of the present embodiment, the depth camera reference diagram acquisition module 403 comprises a spot position acquisition module 4031 and a spot simulation module 4032.
The spot position acquisition module 4031 is configured to determine the positions of all the spots in the reference plane according to the coordinates of all the white points in the pattern of the diffractive optical element and the mapping relationship.
In a specific implementation, the spot position acquisition module comprises a spot coordinate acquisition module and a spot position search module.
The spot coordinate acquisition module is configured to determine the coordinates of all the spots in the reference plane according to the coordinates of all the white points in the pattern of the diffractive optical element and the mapping relationship.
The spot position search module is configured to search, within a specified range around the coordinate of each spot, for the position of the brightness maximum, and to take that position as the position of the spot. In this way the spot position acquisition module determines the positions of all the spots in the reference plane. The search may also be performed over local blocks, which is not detailed here.
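The brightness-maximum search can be sketched as follows (an illustrative implementation, with hypothetical names; the window radius stands in for the "specified range" of the text): the coordinate predicted by the mapping is refined to the brightest pixel in a small window around it.

```python
import numpy as np

def refine_spot_position(image, predicted_xy, radius=3):
    """Search a (2*radius+1) x (2*radius+1) window around the predicted
    spot coordinate and return the (x, y) position of the brightest
    pixel, which is taken as the spot position."""
    x0, y0 = predicted_xy
    h, w = image.shape
    ys = slice(max(0, y0 - radius), min(h, y0 + radius + 1))
    xs = slice(max(0, x0 - radius), min(w, x0 + radius + 1))
    window = image[ys, xs]
    dy, dx = np.unravel_index(np.argmax(window), window.shape)
    return (xs.start + dx, ys.start + dy)
```

Applying this refinement to every predicted coordinate yields the full set of spot positions in the reference plane.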
It should be noted that, in the actually captured first reference diagram, the region (or pixel block) immediately adjacent to the brightness-maximum position also has a certain brightness value, only slightly lower. In the present embodiment, the brightness value of the brightness maximum is set to 1 (bright) and the brightness values of the slightly dimmer surrounding regions are set to 0 (dark). Taking the brightness-maximum position as the center of the spot locates the spot more accurately and improves the accuracy of the simulated reference diagram.
The spot simulation module 4032 is configured to compute the brightness of each spot in the reference plane by simulation using two-dimensional Gaussian convolution, and to form a simulated spot at the position of each spot, thereby forming the depth camera reference diagram required by the present embodiment.
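The Gaussian spot synthesis can be sketched as follows (an illustrative reading of the text, not the patent's exact algorithm; names and parameter values are hypothetical): convolving a unit impulse at each spot position with a 2D Gaussian kernel is equivalent to stamping the kernel, with peak brightness 1, at each position, which is what the code does directly, clipping at the image border.

```python
import numpy as np

def synthesize_reference(shape, spot_positions, sigma=1.5, ksize=9):
    """Simulate each spot's brightness profile by placing a 2D Gaussian
    (peak value 1) at its (x, y) position; equivalent to convolving an
    impulse image with the Gaussian kernel."""
    r = ksize // 2
    ax = np.arange(-r, r + 1)
    g1 = np.exp(-(ax ** 2) / (2 * sigma ** 2))
    kernel = np.outer(g1, g1)  # separable 2D Gaussian, center value 1
    out = np.zeros(shape, dtype=float)
    h, w = shape
    for x, y in spot_positions:
        # Clip the kernel footprint at the image border.
        ys0, ys1 = max(0, y - r), min(h, y + r + 1)
        xs0, xs1 = max(0, x - r), min(w, x + r + 1)
        ky0, kx0 = ys0 - (y - r), xs0 - (x - r)
        patch = kernel[ky0:ky0 + (ys1 - ys0), kx0:kx0 + (xs1 - xs0)]
        out[ys0:ys1, xs0:xs1] = np.maximum(out[ys0:ys1, xs0:xs1], patch)
    return out
```

The resulting image has a smooth, bright-centered blob at every mapped spot position, which is the simulated depth camera reference diagram.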
In the embodiment of the present invention, the speckle pattern captured by the camera serves as the first reference diagram; the mapping relationship between the spots in the first reference diagram and the white points in the pattern of the diffractive optical element is obtained; and the depth camera reference diagram is synthesized by simulation according to this mapping relationship. This effectively avoids the noise points caused by secondary scattering of the laser, requires no additional external equipment, and reduces equipment cost. Further, it overcomes the small field of view of the camera inside the depth camera, so that the entire depth camera reference diagram can be obtained in a single pass. In addition, because this depth camera reference diagram is small in size, it can be obtained and stored more quickly, improving the user experience.
The sequence numbers of the above embodiments of the present invention are for description only and do not indicate the relative merits of the embodiments.
One of ordinary skill in the art will appreciate that all or part of the steps of the above embodiments may be implemented in hardware, or by a program instructing the relevant hardware; the program may be stored in a computer-readable storage medium, which may be a read-only memory, a magnetic disk, an optical disc, or the like.
The foregoing describes only preferred embodiments of the present invention and is not intended to limit it; any modification, equivalent replacement, improvement and the like made within the spirit and principles of the present invention shall fall within the scope of protection of the present invention.

Claims (12)

1. A depth camera reference diagram acquisition method, the depth camera comprising a laser emitting device, a diffractive optical element and a camera, wherein the laser emitting device is configured to emit laser light and project the laser light onto a reference plane through the diffractive optical element, and the camera is configured to capture the speckle pattern formed by the laser light on the reference plane, characterized in that the pattern of the diffractive optical element comprises a plurality of white points, the laser light forms one spot on the reference plane after passing through each white point, and the method comprises:
forming the speckle pattern on the reference plane using the laser emitting device and the diffractive optical element;
capturing the speckle pattern with the camera to obtain a first reference diagram;
determining a mapping relationship between spots in the first reference diagram and the white points in the pattern of the diffractive optical element;
synthesizing the depth camera reference diagram by simulation according to the mapping relationship and the pattern of the diffractive optical element.
2. The method of claim 1, characterized in that determining the mapping relationship between the spots in the first reference diagram and the white points in the pattern of the diffractive optical element comprises:
determining the spot in the first reference diagram that corresponds to a designated white point in the pattern of the diffractive optical element;
determining, in the same coordinate system, the coordinates of the designated white point and of the spot corresponding to the designated white point;
determining the mapping relationship between the spots in the first reference diagram and the white points according to the coordinates of the designated white point and of the spot corresponding to the designated white point.
3. The method of claim 2, characterized in that the shape of the designated white point is different from the shapes of the white points other than the designated white point.
4. The method of claim 1, characterized in that synthesizing the depth camera reference diagram by simulation according to the mapping relationship and the pattern of the diffractive optical element comprises:
determining the positions of all the spots in the reference plane according to the coordinates of all the white points in the pattern of the diffractive optical element and the mapping relationship;
computing the brightness of each spot in the reference plane by simulation using two-dimensional Gaussian convolution, and forming a simulated spot at the position of each spot, to obtain the depth camera reference diagram.
5. The method of claim 4, characterized in that determining the positions of all the spots in the reference plane according to the coordinates of all the white points in the pattern of the diffractive optical element and the mapping relationship comprises:
determining the coordinates of all the spots in the reference plane according to the coordinates of all the white points in the pattern of the diffractive optical element and the mapping relationship;
searching, within a specified range around the coordinate of each spot, for the position of the brightness maximum as the position of each spot.
6. The method of claim 1, characterized in that the mapping relationship comprises a linear mapping relationship or a nonlinear mapping relationship.
7. A depth camera reference diagram acquisition device, the depth camera comprising a laser emitting device, a diffractive optical element and a camera, wherein the laser emitting device is configured to emit laser light and project the laser light onto a reference plane through the diffractive optical element, and the camera is configured to capture the speckle pattern formed by the laser light on the reference plane, characterized in that the pattern of the diffractive optical element comprises a plurality of white points, the laser light forms one spot on the reference plane after passing through each white point, and the device comprises:
a first reference diagram acquisition module, configured to obtain a first reference diagram by capturing, with the camera, the speckle pattern formed on the reference plane;
a determination module, configured to determine a mapping relationship between spots in the first reference diagram and the white points in the pattern of the diffractive optical element;
a depth camera reference diagram acquisition module, configured to synthesize the depth camera reference diagram by simulation according to the mapping relationship and the pattern of the diffractive optical element.
8. The device of claim 7, characterized in that the determination module comprises:
a first sub-determination module, configured to determine the spot in the first reference diagram that corresponds to a designated white point in the pattern of the diffractive optical element;
a second sub-determination module, configured to determine, in the same coordinate system, the coordinates of the designated white point and of the spot corresponding to the designated white point;
a third sub-determination module, configured to determine the mapping relationship between the spots in the first reference diagram and the white points according to the coordinates of the designated white point and of the spot corresponding to the designated white point.
9. The device of claim 8, characterized in that the shape of the designated white point is different from the shapes of the white points other than the designated white point.
10. The device of claim 7, characterized in that the depth camera reference diagram acquisition module comprises:
a spot position acquisition module, configured to determine the positions of all the spots in the reference plane according to the coordinates of all the white points in the pattern of the diffractive optical element and the mapping relationship;
a spot simulation module, configured to compute the brightness of each spot in the reference plane by simulation using two-dimensional Gaussian convolution, and to form a simulated spot at the position of each spot.
11. The device of claim 10, characterized in that the spot position acquisition module comprises:
a spot coordinate acquisition module, configured to determine the coordinates of all the spots in the reference plane according to the coordinates of all the white points in the pattern of the diffractive optical element and the mapping relationship;
a spot position search module, configured to search, within a specified range around the coordinate of each spot, for the position of the brightness maximum as the position of each spot.
12. The device of claim 7, characterized in that the mapping relationship comprises a linear mapping relationship or a nonlinear mapping relationship.
CN201410334261.7A 2014-07-14 2014-07-14 Acquisition methods and device of a kind of depth camera with reference to figure Active CN105306922B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410334261.7A CN105306922B (en) 2014-07-14 2014-07-14 Acquisition methods and device of a kind of depth camera with reference to figure

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410334261.7A CN105306922B (en) 2014-07-14 2014-07-14 Acquisition methods and device of a kind of depth camera with reference to figure

Publications (2)

Publication Number Publication Date
CN105306922A true CN105306922A (en) 2016-02-03
CN105306922B CN105306922B (en) 2017-09-29

Family

ID=55203602

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410334261.7A Active CN105306922B (en) 2014-07-14 2014-07-14 Acquisition methods and device of a kind of depth camera with reference to figure

Country Status (1)

Country Link
CN (1) CN105306922B (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106406002A (en) * 2016-10-28 2017-02-15 深圳奥比中光科技有限公司 Planar array projection device and depth camera
CN106569330A (en) * 2016-10-28 2017-04-19 深圳奥比中光科技有限公司 Design method of optical pattern, an area array projection device and depth camera
CN107564051A (en) * 2017-09-05 2018-01-09 歌尔股份有限公司 A kind of depth information acquisition method and system
CN108227232A (en) * 2016-12-14 2018-06-29 浙江舜宇智能光学技术有限公司 The diverging light formula speckle projector and its focus adjustment method and three-dimensional reconstruction system
CN109167904A (en) * 2018-10-31 2019-01-08 Oppo广东移动通信有限公司 Image acquiring method, image acquiring device, structure optical assembly and electronic device
CN109167903A (en) * 2018-10-31 2019-01-08 Oppo广东移动通信有限公司 Image acquiring method, image acquiring device, structure optical assembly and electronic device
CN109167905A (en) * 2018-10-31 2019-01-08 Oppo广东移动通信有限公司 Image acquiring method, image acquiring device, structure optical assembly and electronic device
CN110059537A (en) * 2019-02-27 2019-07-26 视缘(上海)智能科技有限公司 A kind of three-dimensional face data acquisition methods and device based on Kinect sensor
WO2019174455A1 (en) * 2018-03-12 2019-09-19 Oppo广东移动通信有限公司 Laser projection module and detection method and apparatus therefor, and depth camera module and electronic apparatus
WO2022022136A1 (en) * 2020-07-28 2022-02-03 腾讯科技(深圳)有限公司 Depth image generation method and apparatus, reference image generation method and apparatus, electronic device, and computer readable storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101825431A (en) * 2009-03-05 2010-09-08 普莱姆森斯有限公司 Reference image techniques for three-dimensional sensing
CN102999910A (en) * 2012-11-27 2013-03-27 西安交通大学 Image depth calculating method
WO2013156530A1 (en) * 2012-04-18 2013-10-24 3Shape A/S 3d scanner using merged partial images
CN103796001A (en) * 2014-01-10 2014-05-14 深圳奥比中光科技有限公司 Method and device for synchronously acquiring depth information and color information



Also Published As

Publication number Publication date
CN105306922B (en) 2017-09-29

Similar Documents

Publication Publication Date Title
CN105306922A (en) Method and device for obtaining depth camera reference diagram
CN103649674B (en) Measuring equipment and messaging device
JP6394005B2 (en) Projection image correction apparatus, method and program for correcting original image to be projected
US9759548B2 (en) Image processing apparatus, projector and projector system including image processing apparatus, image processing method
JP6475311B1 (en) Optical tracking system and optical tracking method
JP2021507440A (en) Methods and systems for generating 3D images of objects
JP5633058B1 (en) 3D measuring apparatus and 3D measuring method
JP6800597B2 (en) Controls, control methods and programs
JP2008537190A (en) Generation of three-dimensional image of object by irradiating with infrared pattern
CN107025663A (en) It is used for clutter points-scoring system and method that 3D point cloud is matched in vision system
CN111161358B (en) Camera calibration method and device for structured light depth measurement
CN107808398B (en) Camera parameter calculation device, calculation method, program, and recording medium
JP2012058076A (en) Three-dimensional measurement device and three-dimensional measurement method
US10186051B2 (en) Method and system for calibrating a velocimetry system
CN105378794A (en) 3d recording device, method for producing 3d image, and method for setting up 3d recording device
US10713810B2 (en) Information processing apparatus, method of controlling information processing apparatus, and storage medium
JP2011242183A (en) Image processing device, image processing method, and program
JP7378219B2 (en) Imaging device, image processing device, control method, and program
JP2016024052A (en) Three-dimensional measurement system, three-dimensional measurement method and program
JP7163025B2 (en) Image measuring device, image measuring method, imaging device, program
JP4193342B2 (en) 3D data generator
WO2021022775A1 (en) Depth image generation method, apparatus and device, and computer-readable storage medium
JP2018179577A (en) Position measuring device
US20210183092A1 (en) Measuring apparatus, measuring method and microscope system
WO2022254854A1 (en) Three-dimensional measurement device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant