CN105306922B - Method and device for acquiring a depth camera reference image - Google Patents


Info

Publication number
CN105306922B
CN105306922B (application CN201410334261.7A)
Authority
CN
China
Prior art keywords
spot
optical element
pattern
white point
diffraction optical
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201410334261.7A
Other languages
Chinese (zh)
Other versions
CN105306922A (en)
Inventor
王琳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd
Priority to CN201410334261.7A
Publication of CN105306922A
Application granted
Publication of CN105306922B
Legal status: Active
Anticipated expiration


Landscapes

  • Measurement Of Optical Distance (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a method for acquiring a reference image for a depth camera, belonging to the technical field of image processing. The method includes: forming a speckle pattern on a reference plane using a laser emitting device and a diffractive optical element; capturing the speckle pattern with a camera device to obtain a first reference image; determining the mapping relationship between the spots in the first reference image and the white points in the pattern of the diffractive optical element; and synthesizing the depth camera reference image by simulation according to the mapping relationship and the pattern of the diffractive optical element. In the method and device provided by the invention, the speckle pattern captured by the camera device serves as the first reference image, and the mapping relationship between the spots in the first reference image and the white points of the diffractive optical element pattern is obtained; the depth camera reference image is then synthesized by simulation according to this mapping relationship. This effectively avoids the problem of secondary laser scattering noise, requires no additional external equipment, and reduces equipment cost. A device for acquiring a depth camera reference image is also disclosed.

Description

Method and device for acquiring a depth camera reference image
Technical field
The present invention relates to the technical field of image processing, and in particular to a method and device for acquiring a reference image for a depth camera.
Background technology
Image processing technology processes images to meet psychological, visual, or other application demands. With the continuous development of science and technology and the ever-growing demands of users, image processing technology is applied increasingly widely.
Depth cameras have gradually come to prominence because they enable three-dimensional ranging. One of the key technologies of existing depth cameras is the acquisition of a reference image: according to the position of each speckle in the reference image, the spatial coordinates of other points in real space are calculated. The current method by which a depth camera acquires a reference image is as follows: a laser projector emits tens of thousands of laser beams through a diffractive optical element module, which are projected onto a flat surface to form a set of speckle points. The layout of the speckle pattern is determined by the etched pattern of the diffractive optical element. The camera inside the depth camera photographs the speckle points on the flat surface, and the photo is stored in the depth camera as the reference image. This reference image is used to calculate the depth map of the actual scene.
However, the above method suffers from secondary laser scattering noise: owing to its physical properties, an actual flat surface necessarily has a certain roughness, so the camera also collects light scattered by the surface; this scattering is referred to as "secondary scattering speckle". At present, to solve this problem, the secondary speckle can be suppressed by horizontally moving the camera plane, but horizontal-movement shooting requires adding a guide rail and a camera device, which increases equipment cost.
Summary of the invention
To overcome the above problems, the invention provides a method and device for acquiring a reference image for a depth camera. The technical scheme is as follows:
In one aspect, the invention provides a method for acquiring a depth camera reference image. The depth camera includes a laser emitting device, a diffractive optical element, and a camera device; the laser emitting device is used to emit laser light and project the laser light through the diffractive optical element onto a reference plane; the camera device is used to photograph the speckle pattern formed by the laser light on the reference plane; the pattern of the diffractive optical element includes multiple white points, and the laser light, after passing through each white point, forms one spot on the reference plane. The method includes:
forming the speckle pattern on the reference plane using the laser emitting device and the diffractive optical element;
capturing the speckle pattern with the camera device to obtain a first reference image;
determining the mapping relationship between the spots in the first reference image and the white points in the pattern of the diffractive optical element;
synthesizing the depth camera reference image by simulation according to the mapping relationship and the pattern of the diffractive optical element.
Further, determining the mapping relationship between the spots in the first reference image and the white points in the pattern of the diffractive optical element includes:
determining the spot in the first reference image that corresponds to a specified white point in the pattern of the diffractive optical element;
determining, in the same coordinate system, the coordinates of the specified white point and of the spot corresponding to the specified white point;
determining, according to the coordinates of the specified white point and its corresponding spot, the mapping relationship between the spots in the first reference image and the white points.
In one implementable mode, the shape of the specified white point differs from that of the other white points.
Further, synthesizing the depth camera reference image by simulation according to the mapping relationship and the pattern of the diffractive optical element includes:
determining the positions of all spots on the reference plane according to the coordinates of all white points in the pattern of the diffractive optical element and the mapping relationship;
simulating the brightness of each spot on the reference plane by convolution with a two-dimensional Gaussian function, and forming a simulated spot at the position of each spot, to obtain the depth camera reference image.
Further, determining the positions of all spots on the reference plane according to the coordinates of all white points in the pattern of the diffractive optical element and the mapping relationship includes:
determining the coordinates of all spots on the reference plane according to the coordinates of all white points in the pattern of the diffractive optical element and the mapping relationship;
searching, within a specified range around the coordinates of each spot, for the position of the brightness maximum and taking it as the position of that spot.
Further, the mapping relationship includes a linear mapping relationship or a nonlinear mapping relationship.
In another aspect, the invention provides a device for acquiring a depth camera reference image. The depth camera includes a laser emitting device, a diffractive optical element, and a camera device; the laser emitting device is used to emit laser light and project the laser light through the diffractive optical element onto a reference plane; the camera device is used to photograph the speckle pattern formed by the laser light on the reference plane; the pattern of the diffractive optical element includes multiple white points, and the laser light, after passing through each white point, forms one spot on the reference plane. The device includes:
a first reference image acquisition module, for obtaining a first reference image from the speckle pattern formed on the reference plane and captured by the camera device;
a determining module, for determining the mapping relationship between the spots in the first reference image and the white points in the pattern of the diffractive optical element;
a depth camera reference image acquisition module, for synthesizing the depth camera reference image by simulation according to the mapping relationship and the pattern of the diffractive optical element.
Further, the determining module includes:
a first sub-determining module, for determining the spot in the first reference image that corresponds to a specified white point in the pattern of the diffractive optical element;
a second sub-determining module, for determining, in the same coordinate system, the coordinates of the specified white point and of the spot corresponding to the specified white point;
a third sub-determining module, for determining, according to the coordinates of the specified white point and its corresponding spot, the mapping relationship between the spots in the first reference image and the white points.
In one implementable mode, the shape of the specified white point differs from that of the other white points.
Further, the depth camera reference image acquisition module includes:
a spot position acquisition module, for determining the positions of all spots on the reference plane according to the coordinates of all white points in the pattern of the diffractive optical element and the mapping relationship;
a spot simulation module, for simulating the brightness of each spot on the reference plane by convolution with a two-dimensional Gaussian function and forming a simulated spot at the position of each spot.
Further, the spot position acquisition module includes:
a spot coordinate acquisition module, for determining the coordinates of all spots on the reference plane according to the coordinates of all white points in the pattern of the diffractive optical element and the mapping relationship;
a spot position search module, for searching, within a specified range around the coordinates of each spot, for the position of the brightness maximum and taking it as the position of that spot.
Further, the mapping relationship includes a linear mapping relationship or a nonlinear mapping relationship.
In the present invention, the speckle pattern captured by the camera device serves as the first reference image, and the mapping relationship between the spots in the first reference image and the white points of the diffractive optical element pattern is obtained; the depth camera reference image is then synthesized by simulation according to this mapping relationship. This effectively avoids the problem of secondary laser scattering noise, requires no additional external equipment, and reduces equipment cost.
Brief description of the drawings
Fig. 1a is a structural diagram of the depth camera provided in an embodiment of the present invention;
Fig. 1b is a working-principle diagram of the depth camera provided in an embodiment of the present invention;
Fig. 2 is a flowchart of the depth camera reference image acquisition method provided in Embodiment 1 of the present invention;
Fig. 3 is a flowchart of another depth camera reference image acquisition method provided in Embodiment 2 of the present invention;
Fig. 4 shows the pattern of the diffractive optical element in Embodiment 2 of the present invention;
Fig. 5 shows the first reference image obtained after photographing the speckle pattern in Embodiment 2 of the present invention;
Fig. 6 is a convolution-calculation diagram in Embodiment 2 of the present invention;
Figs. 6a, 6b, and 6c are step diagrams of simulating spots by convolution in Embodiment 2 of the present invention;
Fig. 7 shows the appearance of a local block before processing with the depth camera reference image acquisition method of Embodiment 2 of the present invention;
Fig. 8 shows the local block of Fig. 7 after processing with the depth camera reference image acquisition method;
Fig. 9 is a module diagram of the depth camera reference image acquisition device provided in Embodiment 3 of the present invention;
Fig. 10 is a module diagram of the depth camera reference image acquisition device provided in Embodiment 4 of the present invention.
Detailed description of the embodiments
To make the objects, technical solutions, and advantages of the present invention clearer, embodiments of the present invention are described in further detail below with reference to the accompanying drawings.
To facilitate understanding of the embodiments of the present invention, the structure and working principle of the depth camera are first briefly introduced with reference to Fig. 1a. As shown in Fig. 1a, the depth camera includes a laser emitting device 11, a diffractive optical element (DOE) 12, and a camera device 13. The laser emitting device 11 is used to emit laser light and project the laser light through the diffractive optical element 12 onto a reference plane 14. The reference plane 14 is a smooth plane. The camera device 13 is used to photograph the speckle pattern formed by the laser light on the reference plane 14. The pattern of the diffractive optical element 12 includes multiple white points, and the laser light, after passing through each white point, forms one spot on the reference plane 14 (the diffractive optical element 12 is equivalent to a glass sheet covering the laser emitting device 11, with multiple white points set on the glass sheet).
The spots on the reference plane 14 and the white points on the diffractive optical element 12 can be considered to be in one-to-one correspondence. Each laser beam passing through a white point, after collimation, neither deforms nor diverges, and finally forms one spot on the reference plane 14. The set of all spots constitutes the speckle pattern. Therefore, the layout of the speckle pattern is determined by the pattern of the diffractive optical element 12.
When the laser light emitted by the laser emitting device 11 passes through the diffractive optical element 12, tens of thousands of laser beams are formed and projected onto the reference plane 14, forming the speckle pattern. The camera device 13 photographs the speckle pattern, which is stored as the depth camera reference image; this reference image serves as the reference pattern for calculating the depth map of the actual scene in real time.
The depth camera calculates the depth map of the actual scene based on the principle of triangulation. Referring to Fig. 1b, a laser beam (taking one beam as an example) emitted by the laser emitting device 11 (e.g., a projector) strikes planes at different distances Z1 and Z2, where Z1 is the reference plane. The corresponding spot imaged on the image sensor (Charge-Coupled Device, CCD) of the camera device 13 undergoes a horizontal displacement from Xc1 to Xc2. According to the principle of triangulation, when the actual distance of the reference plane Z1 is known (i.e., Z1 is a constant), the actual distance Zk of a plane at arbitrary distance can be deduced by detecting the displacement, as imaged on the CCD, between the two spots formed by the same light ray on the reference plane Z1 and on the plane Zk.
That is, the speckle pattern of the reference plane Z1 must be stored in the depth camera, and the actual distance of the reference plane Z1 must be known. When an image of a plane at arbitrary distance Zk is input, ranging is achieved by detecting the spot displacement on the CCD of the camera device 13.
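The triangulation described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the disparity relation d = f·b·(1/Z1 − 1/Zk), the focal length, and the baseline are the standard structured-light assumptions, and all numeric values are made up for illustration.

```python
def depth_from_disparity(z_ref, focal_px, baseline, disparity_px):
    """Recover depth Zk from the spot displacement on the sensor.

    Assumed model: a spot at depth Zk shifts by
        d = f * b * (1/Z_ref - 1/Zk)   [pixels]
    relative to its position in the reference-plane image, where f is
    the focal length in pixels and b the projector-camera baseline.
    Solving for Zk gives the depth of the observed plane.
    """
    inv_zk = 1.0 / z_ref - disparity_px / (focal_px * baseline)
    return 1.0 / inv_zk
```

With zero disparity the function returns the reference distance itself, matching the statement that the reference plane's distance must be known in advance.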
Embodiment 1
An embodiment of the present invention provides a method for acquiring a depth camera reference image. The method is applied to the depth camera shown in Fig. 1a.
Referring to Fig. 2, this method includes:
Step 101: form a speckle pattern on the reference plane using the laser emitting device and the diffractive optical element.
Step 102: capture the speckle pattern with the camera device to obtain a first reference image.
Step 103: determine the mapping relationship between the spots in the first reference image and the white points in the pattern of the diffractive optical element.
Step 104: synthesize the depth camera reference image by simulation according to the mapping relationship and the pattern of the diffractive optical element.
In this embodiment of the present invention, the speckle pattern captured by the camera device serves as the first reference image, and the mapping relationship between the spots in the first reference image and the white points of the diffractive optical element pattern is obtained; the depth camera reference image is then synthesized by simulation according to this mapping relationship. This effectively avoids the problem of secondary laser scattering noise, requires no additional external equipment, and reduces equipment cost.
Embodiment 2
An embodiment of the present invention provides a method for acquiring a depth camera reference image, applied to the depth camera shown in Fig. 1a. Referring to Fig. 3, the method includes:
Step 201: form a speckle pattern on the reference plane using the laser emitting device and the diffractive optical element.
Fig. 4 shows the pattern of the diffractive optical element in this embodiment. As can be seen from Fig. 4, the white points are of uniform area and regularly arranged.
Step 202: capture the speckle pattern with the camera device to obtain a first reference image.
Fig. 5 shows the first reference image obtained after photographing the speckle pattern in this embodiment. The spots in the first reference image can be considered to be in one-to-one correspondence with the white points of the diffractive optical element pattern, i.e., a mapping relationship exists between them. As can be seen from Fig. 5, because of secondary laser scattering noise, the speckle points in the first reference image are rather blurred.
In this embodiment, the first reference image is obtained by photographing part of the speckle pattern with the camera device. Clearly, if the camera device can photograph the whole speckle pattern, the captured whole speckle pattern may also serve as the first reference image. The subsequent steps are then performed identically on the acquired first reference image.
Step 203: determine the spot in the first reference image that corresponds to a specified white point in the pattern of the diffractive optical element.
In a specific implementation, the shape of the specified white point can be made different from that of the other white points, so as to determine which white point in the pattern of the diffractive optical element a given spot in the first reference image corresponds to. For example, in Fig. 4, a certain specified white point can be given a cross shape; the spot in the first reference image of Fig. 5 corresponding to this specified white point is then identified by visual inspection.
Alternatively, the correspondence between the spots in the first reference image and the white points in the pattern of the diffractive optical element can be determined by computing the similarity between a first local block of the diffractive optical element pattern and a second local block in the first reference image. There are various methods for measuring similarity. A common one is to compute the sum of the point-to-point absolute differences between the first local block and the second local block, an algorithm known as Sum of Absolute Differences (SAD). The smaller the SAD value, the higher the similarity between the first and second local blocks. A more robust method, zero-mean normalized cross-correlation (ZNCC), can also be used to measure similarity.
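The two block-similarity measures named above can be sketched as follows. This is an illustrative sketch under the usual definitions of SAD and ZNCC, with NumPy arrays standing in for the local blocks; the small epsilon guarding against division by zero is an implementation assumption, not from the patent.

```python
import numpy as np

def sad(block_a, block_b):
    # Sum of point-to-point absolute differences; smaller = more similar.
    return float(np.abs(block_a.astype(float) - block_b.astype(float)).sum())

def zncc(block_a, block_b):
    # Zero-mean normalized cross-correlation; values near 1 = very similar.
    # Subtracting each block's mean makes the score robust to a uniform
    # brightness offset between the pattern and the photographed image.
    a = block_a.astype(float) - block_a.mean()
    b = block_b.astype(float) - block_b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum()) + 1e-12
    return float((a * b).sum() / denom)
```

Note how ZNCC gives a perfect score for two blocks that differ only by a constant brightness shift, whereas SAD does not — the sense in which it is the more robust measure.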
Step 204: determine, in the same coordinate system, the coordinates of the specified white point and of the spot corresponding to the specified white point.
Step 205: determine, according to the coordinates of the specified white point and its corresponding spot, the mapping relationship between the spots in the first reference image and the white points. This mapping relationship includes a linear mapping relationship or a nonlinear mapping relationship.
In a specific implementation, let the coordinates of a specified white point be (x, y) and the coordinates of the corresponding spot be (xp, yp). The white-point coordinates (x, y) and the spot coordinates (xp, yp) satisfy the general projective relation, which can be expressed as k·[xp, yp, 1]^T = H·[x, y, 1]^T, where k > 0, [xp, yp, 1]^T and [x, y, 1]^T are three-row, one-column matrices, and H is a 3×3 homography matrix. H can be solved from more than four pairs of known corresponding points by solving a system of linear equations. It follows that, in the ideal case, a linear mapping relationship exists between the spot coordinates (xp, yp) and the white-point coordinates (x, y).
In addition to the linear mapping relationship, owing to deformation effects of the laser emitting device, the actually projected speckle pattern exhibits radial and tangential distortion; these distortions introduce a nonlinear mapping relationship between the actual spot coordinates (xp, yp) and the white-point coordinates (x, y). This nonlinear mapping relationship can be computed by introducing nonlinear parameters into an optimization.
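The linear-mapping case can be illustrated with the standard direct linear transform (DLT): the text states that H can be solved from more than four pairs of known corresponding points by a system of linear equations, and the SVD-based least-squares solution below is one common way to do that. This is a generic sketch, not the patent's code.

```python
import numpy as np

def fit_homography(white_pts, spot_pts):
    """Estimate H in k*[xp, yp, 1]^T = H*[x, y, 1]^T from >= 4
    white-point/spot correspondences via the DLT linear system."""
    rows = []
    for (x, y), (xp, yp) in zip(white_pts, spot_pts):
        # each correspondence contributes two linear equations in H's entries
        rows.append([x, y, 1, 0, 0, 0, -xp * x, -xp * y, -xp])
        rows.append([0, 0, 0, x, y, 1, -yp * x, -yp * y, -yp])
    A = np.asarray(rows, dtype=float)
    _, _, vt = np.linalg.svd(A)          # least-squares null vector of A
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]                   # fix the scale ambiguity

def map_point(H, x, y):
    # Project a white point through H to its predicted spot coordinates.
    v = H @ np.array([x, y, 1.0])
    return v[0] / v[2], v[1] / v[2]
```

Once H is fitted from the identified correspondences, `map_point` predicts a spot position for every white point of the diffractive optical element pattern — the step the method relies on next.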
Steps 203 to 205 above thus realize: determining the mapping relationship between the spots in the first reference image and the white points in the pattern of the diffractive optical element.
Step 206: determine the positions of all spots on the reference plane according to the coordinates of all white points in the pattern of the diffractive optical element and the mapping relationship.
In a specific implementation, the coordinates of all spots on the reference plane can first be determined from the coordinates of all white points in the pattern of the diffractive optical element and the mapping relationship. Then, within a specified range around the coordinates of each spot, the position of the brightness maximum is searched for and taken as the position of that spot, thereby determining the positions of all spots on the reference plane. The search may also be performed by local block search, which is not described in detail here.
It should be noted that in the actually photographed first reference image, the region (or pixel block) immediately adjacent to the brightness-maximum position likewise has a certain brightness value, only slightly lower. In this embodiment, the brightness value at the brightness maximum is set to 1 (bright), and the brightness values of the slightly darker surrounding regions are set to 0 (dark). By taking the brightness-maximum position as the center of a spot, the spot can be located more accurately, improving the accuracy of the reference image obtained by simulation.
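The brightness-maximum search of step 206 can be sketched as a window search around each predicted spot coordinate. The window radius and the row-major image layout are assumptions made for illustration.

```python
import numpy as np

def refine_spot_position(img, x, y, radius=3):
    """Search the window around a predicted spot coordinate (x, y) for
    the brightest pixel and return its (x, y) position; the window is
    clipped at the image borders."""
    h, w = img.shape
    x0, x1 = max(0, x - radius), min(w, x + radius + 1)
    y0, y1 = max(0, y - radius), min(h, y + radius + 1)
    window = img[y0:y1, x0:x1]
    dy, dx = np.unravel_index(np.argmax(window), window.shape)
    return int(x0 + dx), int(y0 + dy)
```

Applying this to every spot coordinate predicted by the mapping relationship yields the refined positions of all spots on the reference plane.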
It can be understood that, in other embodiments, step 206 can also be implemented as follows: determine the coordinates of all spots on the reference plane according to the coordinates of all white points in the pattern of the diffractive optical element and the aforementioned mapping relationship, and take the determined coordinates directly as the position of each spot.
Step 207: simulate the brightness of each spot on the reference plane by convolution with a two-dimensional Gaussian function, forming a simulated spot at the position of each spot, to obtain the depth camera reference image.
In this example, the brightness of a spot on the actual reference plane is assumed to follow a Gaussian distribution, so a synthetic spot can be generated at the position of each single-pixel spot.
Fig. 6 is a calculation-step diagram for one-dimensional convolution, where part (a) is the convolution kernel, which can be regarded as the Gaussian kernel used; the impulse functions in part (b) can be regarded as the positions of the spots; and part (c) is the result after convolution — it can be seen that after convolution each white-point position takes on the shape of the convolution kernel.
The following is an example of simulating spots by convolution.
Fig. 6a is the binary map characterizing the spot positions: the brightness value at each spot position is 255 and elsewhere is 0. Fig. 6b is the Gaussian convolution kernel used; the upper right corner shows its appearance as an image. Fig. 6c is the simulated speckle pattern resulting from the convolution.
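The procedure of Figs. 6a–6c can be sketched as follows: a binary impulse map with value 255 at each spot position is convolved with a peak-normalized two-dimensional Gaussian kernel, so every impulse takes on the kernel's shape. The kernel size and sigma are assumed values, not taken from the patent.

```python
import numpy as np

def gaussian_kernel(size=7, sigma=1.5):
    # 2-D Gaussian kernel, peak-normalized so the center value is 1.
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return k / k.max()

def simulate_reference_image(shape, spot_positions):
    """Place a 255-valued impulse at each (x, y) spot position and
    convolve with the Gaussian kernel (direct convolution: add a
    scaled, border-clipped kernel centered on each impulse)."""
    impulses = np.zeros(shape)
    for x, y in spot_positions:
        impulses[y, x] = 255.0
    kern = gaussian_kernel()
    r = kern.shape[0] // 2
    h, w = shape
    out = np.zeros(shape)
    for y, x in zip(*np.nonzero(impulses)):
        y0, y1 = max(0, y - r), min(h, y + r + 1)
        x0, x1 = max(0, x - r), min(w, x + r + 1)
        out[y0:y1, x0:x1] += impulses[y, x] * kern[
            r - (y - y0): r + (y1 - y), r - (x - x0): r + (x1 - x)
        ]
    return out
```

Because the impulse map is zero everywhere except at spot positions, this direct summation is equivalent to a full 2-D convolution but only touches the spot neighborhoods.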
Figs. 7 and 8 compare a certain local block before and after processing with the depth camera reference image acquisition method of this embodiment. As can be seen from the figures, the speckle pattern simulated with this embodiment is markedly clearer and can be used as the reference image in the depth camera, so that the depth of the actual scene can be calculated effectively.
Through the aforementioned steps 206 and 207, the depth camera reference image is synthesized by simulation according to the mapping relationship and the pattern of the diffractive optical element.
In this embodiment of the present invention, the speckle pattern captured by the camera device serves as the first reference image, and the mapping relationship between the spots in the first reference image and the white points of the diffractive optical element pattern is obtained; the depth camera reference image is then synthesized by simulation according to this mapping relationship. This effectively avoids the problem of secondary laser scattering noise, requires no additional external equipment, and reduces equipment cost. Moreover, it also solves the problem of the small shooting angle of the camera inside the depth camera, since the whole depth camera reference image can be obtained at once. In addition, because the depth camera reference image is small in size, it can be obtained and stored more quickly, improving the user experience.
Embodiment 3
An embodiment of the present invention provides a device for acquiring a depth camera reference image, applicable to the depth camera shown in Fig. 1a. Referring to Fig. 9, the device includes a first reference image acquisition module 301, a determining module 302, and a depth camera reference image acquisition module 303.
The first reference image acquisition module 301 is used to obtain a first reference image from the speckle pattern formed on the reference plane and captured by the camera device;
the determining module 302 is used to determine the mapping relationship between the spots in the first reference image and the white points in the pattern of the diffractive optical element;
the depth camera reference image acquisition module 303 is used to synthesize the depth camera reference image by simulation according to the mapping relationship and the pattern of the diffractive optical element.
In this embodiment of the present invention, the speckle pattern captured by the camera device serves as the first reference image, and the mapping relationship between the spots in the first reference image and the white points of the diffractive optical element pattern is obtained; the depth camera reference image is then synthesized by simulation according to this mapping relationship. This effectively avoids the problem of secondary laser scattering noise, requires no additional external equipment, and reduces equipment cost.
Embodiment 4
An embodiment of the present invention provides a device for acquiring a depth camera reference image, applicable to the depth camera shown in Fig. 1a. Referring to Fig. 10, the device includes a first reference image acquisition module 401, a determining module 402, and a depth camera reference image acquisition module 403.
The first reference image acquisition module 401 is used to obtain a first reference image from the speckle pattern formed on the reference plane and captured by the camera device.
In this embodiment, the first reference image is obtained by photographing part of the speckle pattern with the camera device. Clearly, if the camera device can photograph the whole speckle pattern, the captured whole speckle pattern may also serve as the first reference image.
The determining module 402 is used to determine the mapping relationship between the spots in the first reference image and the white points in the pattern of the diffractive optical element.
In one implementation of the present invention, the determining module includes a first sub-determining module 4021, a second sub-determining module 4022, and a third sub-determining module 4023.
The first sub-determining module 4021 is used to determine the spot in the first reference image that corresponds to a specified white point in the pattern of the diffractive optical element.
In a specific implementation, the shape of the specified white point can be made different from that of the other white points, so as to determine which white point in the pattern of the diffractive optical element a given spot in the first reference image corresponds to. For example, a certain specified white point can be given a cross shape; the spot in the first reference image corresponding to this specified white point is then identified by visual inspection.
Alternatively, the correspondence between the spots in the first reference image and the white points in the pattern of the diffractive optical element can be determined by computing the similarity between a first local block of the diffractive optical element pattern and a second local block in the first reference image. There are various methods for measuring similarity. A common one is to compute the sum of the point-to-point absolute differences between the first and second local blocks, known as the SAD algorithm; the smaller the SAD value, the higher the similarity between the two local blocks. A more robust method (ZNCC) can also be used to measure similarity.
The second sub-determining module 4022 is configured to determine, in the same coordinate system, the coordinates of the specified white point and of the spot corresponding to the specified white point.
The third sub-determining module 4023 is configured to determine the mapping relation between the spots in the first reference image and the white points according to the coordinates of the specified white point and of the spot corresponding to the specified white point.
The mapping relation may be a linear mapping relation or a nonlinear mapping relation. For details of the mapping relation between the spots in the first reference image and the white points, reference may be made to Embodiment 2; it is not described in detail here.
Depth camera reference image acquisition module 403 is configured to synthesize the depth camera reference image by simulation according to the mapping relation and the pattern of the diffractive optical element.
In one implementation of the present embodiment, the depth camera reference image acquisition module 403 includes a spot position acquisition module 4031 and a spot simulation module 4032.
Spot position acquisition module 4031 is configured to determine the positions of all spots on the reference plane according to the coordinates of all white points in the pattern of the diffractive optical element and the mapping relation.
In a specific implementation, the spot position acquisition module includes a spot coordinate acquisition module and a spot position search module.
The spot coordinate acquisition module is configured to determine the coordinates of all spots on the reference plane according to the coordinates of all white points in the pattern of the diffractive optical element and the mapping relation.
The spot position search module is configured to search, within a specified range around the coordinates of each spot, for the position of the brightness maximum, and to take that position as the position of the spot. Using the spot position acquisition module, the positions of all spots on the reference plane can thus be determined. The search may also be carried out by local block search, which is not described in detail here.
It should be noted that, in the actually captured first reference image, the region blocks (or pixel blocks) immediately adjacent to the brightness-maximum position also have a certain brightness value, only slightly lower. In the present embodiment, the brightness value of the brightness-maximum point is set to 1 (bright), and the brightness values of the slightly darker surrounding region blocks are set to 0 (dark). By taking the brightness-maximum position as the center of a spot, the spot can be located more accurately, improving the accuracy of the simulated reference image.
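The brightness-maximum search around a predicted spot coordinate can be sketched as follows; this is an illustrative Python/NumPy sketch, and the window radius parameter and function name are our own assumptions:

```python
import numpy as np

def refine_spot(image, x, y, radius=3):
    """Search a (2*radius+1)-square window around the predicted coordinate
    (x, y) for the brightness maximum and return its position as the spot
    center. The window is clipped at the image borders."""
    h, w = image.shape
    y0, y1 = max(0, y - radius), min(h, y + radius + 1)
    x0, x1 = max(0, x - radius), min(w, x + radius + 1)
    window = image[y0:y1, x0:x1]
    dy, dx = np.unravel_index(np.argmax(window), window.shape)
    return x0 + dx, y0 + dy
```

Applying this refinement at every coordinate predicted by the mapping relation yields the positions of all spots on the reference plane.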
Spot simulation module 4032 is configured to simulate the brightness of each spot on the reference plane by convolution with a two-dimensional Gaussian function, forming a simulated spot at the position of each spot, thereby producing the depth camera reference image required by the present embodiment.
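The synthesis step, placing an impulse at each spot position and convolving with a two-dimensional Gaussian, can be sketched as follows. This is an illustrative Python/NumPy sketch; the separable-kernel implementation, parameter names and truncation radius are our own assumptions:

```python
import numpy as np

def synthesize_reference(shape, spot_positions, sigma=1.0):
    """Place a unit impulse at each (x, y) spot position and convolve with a
    2-D Gaussian to simulate the brightness profile of each spot."""
    h, w = shape
    impulses = np.zeros(shape)
    for x, y in spot_positions:
        if 0 <= int(y) < h and 0 <= int(x) < w:
            impulses[int(y), int(x)] = 1.0
    # A 2-D Gaussian is separable: convolve with a 1-D kernel along rows,
    # then along columns. Truncate the kernel at 3 sigma and normalize.
    r = int(3 * sigma)
    k = np.exp(-np.arange(-r, r + 1) ** 2 / (2 * sigma ** 2))
    k /= k.sum()
    out = np.apply_along_axis(lambda row: np.convolve(row, k, mode="same"), 1, impulses)
    out = np.apply_along_axis(lambda col: np.convolve(col, k, mode="same"), 0, out)
    return out
```

The resulting array is the simulated depth camera reference image: each spot appears as a small Gaussian blob centered at its mapped position.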
In the embodiment of the present invention, a speckle pattern captured by the camera device is used as the first reference image, and the mapping relation between the spots in the first reference image and the white points of the pattern of the diffractive optical element is obtained. The depth camera reference image is then synthesized by simulation according to this mapping relation. This effectively avoids the problem of stray laser scattering, and no additional external equipment is required, reducing equipment cost. It also overcomes the small shooting angle of view of the camera inside the depth camera, so that the entire depth camera reference image can be obtained at once. Moreover, since the depth camera reference image is small in size, it can be obtained and stored more quickly, improving the user experience.
The serial numbers of the above embodiments of the present invention are for description only and do not represent the relative merits of the embodiments.
One of ordinary skill in the art will appreciate that all or part of the steps of the above embodiments may be implemented by hardware, or by a program instructing the relevant hardware, where the program may be stored in a computer-readable storage medium such as a read-only memory, a magnetic disk or an optical disc.
The foregoing is only a preferred embodiment of the present invention and is not intended to limit the invention. Any modification, equivalent substitution, improvement and the like made within the spirit and principles of the present invention shall be included within the scope of protection of the present invention.

Claims (10)

1. A depth camera reference image acquisition method, the depth camera comprising a laser emitting device, a diffractive optical element and a camera device, the laser emitting device being configured to emit laser light and project the laser light onto a reference plane through the diffractive optical element, the camera device being configured to capture a speckle pattern formed by the laser light on the reference plane, characterized in that the pattern of the diffractive optical element comprises a plurality of white points, and the laser light, after passing through each of the white points, forms one spot on the reference plane, the method comprising:
forming the speckle pattern on the reference plane using the laser emitting device and the diffractive optical element;
capturing the speckle pattern with the camera device to obtain a first reference image;
determining a mapping relation between the spots in the first reference image and the white points in the pattern of the diffractive optical element;
determining the positions of all spots on the reference plane according to the coordinates of all white points in the pattern of the diffractive optical element and the mapping relation;
simulating the brightness of each spot on the reference plane by two-dimensional Gaussian convolution, and forming a simulated spot at the position of each spot, to obtain the depth camera reference image.
2. The method according to claim 1, characterized in that determining the mapping relation between the spots in the first reference image and the white points in the pattern of the diffractive optical element comprises:
determining the spot in the first reference image corresponding to a specified white point in the pattern of the diffractive optical element;
determining, in the same coordinate system, the coordinates of the specified white point and of the spot corresponding to the specified white point;
determining the mapping relation between the spots in the first reference image and the white points according to the coordinates of the specified white point and of the spot corresponding to the specified white point.
3. The method according to claim 2, characterized in that the shape of the specified white point is different from the shape of the other white points.
4. The method according to claim 1, characterized in that determining the positions of all spots on the reference plane according to the coordinates of all white points in the pattern of the diffractive optical element and the mapping relation comprises:
determining the coordinates of all spots on the reference plane according to the coordinates of all white points in the pattern of the diffractive optical element and the mapping relation;
searching, within a specified range around the coordinates of each spot, for the position of the brightness maximum as the position of the spot.
5. The method according to claim 1, characterized in that the mapping relation comprises a linear mapping relation or a nonlinear mapping relation.
6. A depth camera reference image acquisition apparatus, the depth camera comprising a laser emitting device, a diffractive optical element and a camera device, the laser emitting device being configured to emit laser light and project the laser light onto a reference plane through the diffractive optical element, the camera device being configured to capture a speckle pattern formed by the laser light on the reference plane, characterized in that the pattern of the diffractive optical element comprises a plurality of white points, and the laser light, after passing through each of the white points, forms one spot on the reference plane, the apparatus comprising:
a first reference image acquisition module, configured to obtain a first reference image by capturing, with the camera device, the speckle pattern formed on the reference plane;
a determining module, configured to determine a mapping relation between the spots in the first reference image and the white points in the pattern of the diffractive optical element;
a depth camera reference image acquisition module, configured to determine the positions of all spots on the reference plane according to the coordinates of all white points in the pattern of the diffractive optical element and the mapping relation, to simulate the brightness of each spot on the reference plane by two-dimensional Gaussian convolution, and to form a simulated spot at the position of each spot.
7. The apparatus according to claim 6, characterized in that the determining module comprises:
a first sub-determining module, configured to determine the spot in the first reference image corresponding to a specified white point in the pattern of the diffractive optical element;
a second sub-determining module, configured to determine, in the same coordinate system, the coordinates of the specified white point and of the spot corresponding to the specified white point;
a third sub-determining module, configured to determine the mapping relation between the spots in the first reference image and the white points according to the coordinates of the specified white point and of the spot corresponding to the specified white point.
8. The apparatus according to claim 7, characterized in that the shape of the specified white point is different from the shape of the other white points.
9. The apparatus according to claim 6, characterized in that the depth camera reference image acquisition module comprises:
a spot coordinate acquisition module, configured to determine the coordinates of all spots on the reference plane according to the coordinates of all white points in the pattern of the diffractive optical element and the mapping relation;
a spot position search module, configured to search, within a specified range around the coordinates of each spot, for the position of the brightness maximum as the position of the spot.
10. The apparatus according to claim 6, characterized in that the mapping relation comprises a linear mapping relation or a nonlinear mapping relation.
CN201410334261.7A 2014-07-14 2014-07-14 Acquisition methods and device of a kind of depth camera with reference to figure Active CN105306922B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410334261.7A CN105306922B (en) 2014-07-14 2014-07-14 Acquisition methods and device of a kind of depth camera with reference to figure

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410334261.7A CN105306922B (en) 2014-07-14 2014-07-14 Acquisition methods and device of a kind of depth camera with reference to figure

Publications (2)

Publication Number Publication Date
CN105306922A CN105306922A (en) 2016-02-03
CN105306922B true CN105306922B (en) 2017-09-29

Family

ID=55203602

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410334261.7A Active CN105306922B (en) 2014-07-14 2014-07-14 Acquisition methods and device of a kind of depth camera with reference to figure

Country Status (1)

Country Link
CN (1) CN105306922B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106406002B (en) * 2016-10-28 2018-05-04 深圳奥比中光科技有限公司 Area array projection device and depth camera
CN106569330B (en) * 2016-10-28 2019-07-12 深圳奥比中光科技有限公司 A kind of design method of optical design, area array projection device and a kind of depth camera
CN108227232A (en) * 2016-12-14 2018-06-29 浙江舜宇智能光学技术有限公司 The diverging light formula speckle projector and its focus adjustment method and three-dimensional reconstruction system
CN107564051B (en) * 2017-09-05 2020-06-02 歌尔股份有限公司 Depth information acquisition method and system
EP3561574A4 (en) 2018-03-12 2019-12-25 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Laser projection module and detection method and apparatus therefor, and depth camera module and electronic apparatus
CN109167903B (en) * 2018-10-31 2021-01-15 Oppo广东移动通信有限公司 Image acquisition method, image acquisition device, structured light assembly and electronic device
CN109167905B (en) * 2018-10-31 2020-07-31 Oppo广东移动通信有限公司 Image acquisition method, image acquisition device, structured light assembly and electronic device
CN109167904B (en) * 2018-10-31 2020-04-28 Oppo广东移动通信有限公司 Image acquisition method, image acquisition device, structured light assembly and electronic device
CN110059537A (en) * 2019-02-27 2019-07-26 视缘(上海)智能科技有限公司 A kind of three-dimensional face data acquisition methods and device based on Kinect sensor
CN112752088B (en) * 2020-07-28 2023-03-28 腾讯科技(深圳)有限公司 Depth image generation method and device, reference image generation method and electronic equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101825431A (en) * 2009-03-05 2010-09-08 普莱姆森斯有限公司 Reference image techniques for three-dimensional sensing
CN102999910A (en) * 2012-11-27 2013-03-27 西安交通大学 Image depth calculating method
WO2013156530A1 (en) * 2012-04-18 2013-10-24 3Shape A/S 3d scanner using merged partial images
CN103796001A (en) * 2014-01-10 2014-05-14 深圳奥比中光科技有限公司 Method and device for synchronously acquiring depth information and color information

Also Published As

Publication number Publication date
CN105306922A (en) 2016-02-03

Similar Documents

Publication Publication Date Title
CN105306922B (en) Acquisition methods and device of a kind of depth camera with reference to figure
CN103765870B (en) Image processing apparatus, projector and projector system including image processing apparatus, image processing method
Falcao et al. Plane-based calibration of a projector-camera system
CN103649674B (en) Measuring equipment and messaging device
CN107505324B (en) 3D scanning device and scanning method based on binocular collaborative laser
CN107564069A (en) The determination method, apparatus and computer-readable recording medium of calibrating parameters
JP2008537190A (en) Generation of three-dimensional image of object by irradiating with infrared pattern
JP5633058B1 (en) 3D measuring apparatus and 3D measuring method
CN111161358B (en) Camera calibration method and device for structured light depth measurement
JP6800597B2 (en) Controls, control methods and programs
CN104111038B (en) The method utilizing the phase error of the phase place blending algorithm saturated generation of reparation
WO2011145285A1 (en) Image processing device, image processing method and program
JP6097903B2 (en) Three-dimensional shape acquisition apparatus, processing method, and program
WO2007037227A1 (en) Position information detection device, position information detection method, and position information detection program
CN107564051B (en) Depth information acquisition method and system
CN113298886A (en) Calibration method of projector
WO2021022775A1 (en) Depth image generation method, apparatus and device, and computer-readable storage medium
Draréni et al. Methods for geometrical video projector calibration
CN103876706B (en) Improvements in and relating to ophthalmoscopes
CN108036742A (en) New lines structural light three-dimensional method for sensing and device
JP2007315777A (en) Three-dimensional shape measurement system
CN109859313A (en) 3D point cloud data capture method, device, 3D data creation method and system
CN107063131B (en) A kind of time series correlation non-valid measurement point minimizing technology and system
US20160349045A1 (en) A method of measurement of linear dimensions of three-dimensional objects
JP6671589B2 (en) Three-dimensional measurement system, three-dimensional measurement method, and three-dimensional measurement program

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant