CN107783353A - Apparatus and system for capturing stereoscopic images - Google Patents
Apparatus and system for capturing stereoscopic images
- Publication number
- CN107783353A CN107783353A CN201610737230.5A CN201610737230A CN107783353A CN 107783353 A CN107783353 A CN 107783353A CN 201610737230 A CN201610737230 A CN 201610737230A CN 107783353 A CN107783353 A CN 107783353A
- Authority
- CN
- China
- Prior art keywords
- source
- light
- stereopsis
- module
- flight time
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
- G03B15/02—Illuminating scene
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B35/00—Stereoscopic photography
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
- Optical Radar Systems And Details Thereof (AREA)
- Measurement Of Optical Distance (AREA)
Abstract
The present invention provides an apparatus and system for capturing stereoscopic images. The apparatus for capturing stereoscopic images includes a time-of-flight image acquisition module and a light-emitting module. The light-emitting module produces a structured light source and an illumination light source, and the time-of-flight image acquisition module captures the first and second reflected light sources formed when the structured light source and the illumination light source are projected onto an object, thereby obtaining stereoscopic image information of the object. By integrating time-of-flight (TOF) and structured-light techniques, the present invention obtains stereoscopic image information with high depth precision and low noise, while reducing the overall volume and power consumption of the apparatus and system.
Description
Technical field
The present invention relates to an apparatus and system for capturing image information, and more particularly to an apparatus and system for capturing stereoscopic images.
Background
Three-dimensional scanning technology has been widely used in industry and daily life, for example in 3D scanning, surveillance and recognition, and depth cameras. It is mainly used to detect and analyze the geometry and appearance of an object or environment, and to perform three-dimensional computation on the detected data so as to simulate and build mathematical models of the object and environment.
Existing 3D scanning techniques acquire and process stereoscopic image data using methods such as time of flight (Time of Flight, TOF), stereo vision (Stereo Vision), and structured light (Structured Light). Among these, the time-of-flight method calculates the distance between the object under test and the image acquisition module from the time a projected light source takes to reach the object's surface, be reflected, and return to the sensor, and thereby obtains the object's stereoscopic image data. Its advantage is that, compared with stereo vision and triangulation-based structured-light techniques, it can be realized with a low-complexity system. For example, a time-of-flight image acquisition module may consist of closely spaced light-emitting components and an optical unit (such as a lens) for focusing the reflected light, with no need for triangulation, i.e. without considering either the distance between the time-of-flight image acquisition module and the structured light source (the baseline (Baseline)) or the angle between the baseline and the light reflected from the object. In addition, a time-of-flight image acquisition module captures complete image information in a single projection-shooting shot, making it suitable for real-time applications (Real-time Application). However, the depth noise (Depth Noise) of the image information obtained by the time-of-flight method is large, and the precision of the resulting image data is relatively low.
Structured-light techniques, on the other hand, project a special pattern onto the object under test, capture a three-dimensional view of its surface with a sensor, and finally compute the object's three-dimensional coordinates by triangulation. Although structured light achieves lower depth noise and higher image-data precision, computing a three-dimensional point cloud (3D Point Cloud) with it is relatively time-consuming.
Therefore, a solution is still needed that combines the advantages of both the time-of-flight method and structured light, and thereby provides a system for capturing stereoscopic images that offers high depth precision together with a smaller volume and lower power consumption.
Summary of the invention
To solve the above technical problem, according to one embodiment of the present invention, an apparatus for capturing stereoscopic images is provided, comprising a time-of-flight image acquisition module and a light-emitting module. The time-of-flight image acquisition module is used to capture stereoscopic image information of at least one object. The light-emitting module is adjacent to the time-of-flight image acquisition module and produces a structured light source and an illumination light source directed at the object. The structured light source is reflected by the object to form a first reflected light source, and the illumination light source is reflected by the object to form a second reflected light source. The time-of-flight image acquisition module captures the first reflected light source and the second reflected light source to obtain the stereoscopic image information of the object.
Further, the light-emitting module includes a light generation unit, a light diffraction component, and a light diffusion component. The structured light source produced by the light-emitting module is formed by conversion through the light diffraction component, and the illumination light source produced by the light-emitting module is formed by conversion through the light diffusion component.
Further, the light generation unit includes a laser generator for producing a laser light source, and an optical assembly. The laser light source passes in sequence through the optical assembly and the light diffraction component to form the structured light source, and in sequence through the optical assembly and the light diffusion component to form the illumination light source.
Further, the laser generator has at least one collimator, and the optical assembly includes a beam-splitting component and a reflecting component. The laser light source is collimated by the at least one collimator, split by the beam-splitting component, and converted by the light diffraction component to form the structured light source; it is likewise collimated by the at least one collimator, split by the beam-splitting component, reflected by the reflecting component, and converted by the light diffusion component to form the illumination light source.
Further, the light generation unit includes a laser generator and a light-emitting component. The laser light source produced by the laser generator passes through the light diffraction component to be converted into the structured light source, and the projection light source produced by the light-emitting component passes through the light diffusion component to be converted into the illumination light source.
Further, the time-of-flight image acquisition module, the light diffraction component, and the light diffusion component are arranged in a line, and the light-receiving surface of the time-of-flight image acquisition module, the exit surface of the light diffraction component, and the exit surface of the light diffusion component are all aligned along the same datum axis.
Further, the light diffraction component and the light diffusion component are arranged on the same side of the time-of-flight image acquisition module, with the light diffusion component placed between the light diffraction component and the time-of-flight image acquisition module.
Further, the light diffraction component and the light diffusion component are arranged on opposite sides of the time-of-flight image acquisition module.
Further, the time-of-flight image sensing module also includes a switching module comprising an infrared band-pass filter, a visible-light band-pass filter, and a light switcher. The light switcher directs both the first and second reflected light sources to the infrared band-pass filter, or directs the light that an ambient light source reflects off the object to the visible-light band-pass filter. When the switching of the light switcher directs the first and second reflected light sources to the infrared band-pass filter, they pass through it to produce black-and-white stereoscopic image information; when the switching of the light switcher directs the ambient light reflected off the object to the visible-light band-pass filter, that reflected light passes through it to obtain color image information.
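The switcher described above is a two-way routing decision; the patent specifies it in hardware terms only. The sketch below restates the logic in Python for clarity, with the `Mode` enum, the function name, and the returned labels all being our own illustrative assumptions.

```python
# Hedged sketch of the switching-module routing logic; the patent names
# no software API, so all names here are assumptions for illustration.
from enum import Enum


class Mode(Enum):
    INFRARED = "infrared"  # reflected structured/illumination light -> depth
    VISIBLE = "visible"    # ambient reflection -> color texture


def select_filter(mode: Mode) -> str:
    """Route the incoming light to the band-pass filter for the active mode."""
    if mode is Mode.INFRARED:
        # Wavelengths outside the infrared band are filtered out; the
        # remaining signal feeds the spatial/temporal modulation algorithms.
        return "infrared band-pass filter"
    # In visible mode the light-emitting module can stay off; only the
    # ambient reflection passes through to produce color information.
    return "visible band-pass filter"
```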
According to another embodiment of the present invention, a system for capturing stereoscopic images is provided for capturing image information of at least one object. The system includes a processor module, a time-of-flight processing module electrically connected to the processor module, a light-emitting module, and a time-of-flight image acquisition module electrically connected to the time-of-flight processing module. The light-emitting module produces a structured light source and an illumination light source directed at the object; the structured light source is reflected by the object to produce structured-light information, and the illumination light source is reflected by the object to produce illumination-light information. The time-of-flight image acquisition module captures the structured-light information and the illumination-light information. The structured-light information it captures is processed by the processor module to obtain a structured-light 3D point cloud, and the illumination-light information it captures is processed by the processor module to obtain a time-of-flight 3D point cloud. The structured-light 3D point cloud and the time-of-flight 3D point cloud can be merged into a three-dimensional depth map that provides the image information of the object.
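The patent states that the two point clouds "can be merged" without specifying how. One plausible reading, consistent with its claim that structured light excels at close range while TOF handles mid and long range, is a per-point selection by depth; the sketch below implements that reading as an assumption, with the crossover distance being an illustrative guess rather than a disclosed value.

```python
# Assumed fusion strategy, not the patented method: keep the high-precision
# structured-light points up close and the TOF points farther away.
import numpy as np

NEAR_LIMIT_M = 1.0  # illustrative crossover distance; not given in the patent


def fuse_point_clouds(structured: np.ndarray, tof: np.ndarray) -> np.ndarray:
    """Merge two registered (N, 3) point clouds into one.

    Both clouds are assumed to share the same world coordinate frame,
    with the z column holding depth along the optical axis.
    """
    near = structured[structured[:, 2] <= NEAR_LIMIT_M]
    far = tof[tof[:, 2] > NEAR_LIMIT_M]
    return np.vstack([near, far])
```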
Further, the light-emitting module includes a light generation unit, a light diffraction component, and a light diffusion component. The structured light source produced by the light-emitting module is formed by conversion through the light diffraction component, and the illumination light source produced by the light-emitting module is formed by conversion through the light diffusion component.
Further, the system for capturing stereoscopic images also includes a memory unit in which the structured-light 3D point cloud, the time-of-flight 3D point cloud, and the three-dimensional depth map are stored.
Further, the system for capturing stereoscopic images also includes a computing unit electrically connected to the processor module for correcting the three-dimensional depth map.
The technical approach of the present invention is to produce the structured light source and the illumination light source within a single image-capture procedure, and to use the time-of-flight image acquisition module to capture the first and second reflected light sources formed when the structured light source and the illumination light source reflect off the surface of the object under test. This combines the advantages of both the time-of-flight method and structured light, yielding stereoscopic image information of the object under test with low depth noise (Depth Noise) and high depth precision. In addition, the apparatus and system of the present invention for capturing stereoscopic images have a smaller overall volume and consume less power in operation.
For a further understanding of the features and technical content of the present invention, refer to the following detailed description and accompanying drawings. The drawings are provided for reference and illustration only and are not intended to limit the present invention.
Brief description of the drawings
Fig. 1 is a schematic diagram of the apparatus for capturing stereoscopic images provided by the first embodiment of the present invention;
Fig. 2 is a schematic diagram of the apparatus for capturing stereoscopic images provided by the second embodiment of the present invention;
Fig. 3 is a schematic diagram of the apparatus for capturing stereoscopic images provided by the third embodiment of the present invention;
Fig. 4 is a flowchart of the method for capturing stereoscopic images provided by the present invention;
Fig. 5A and Fig. 5B are schematic diagrams of the system for capturing stereoscopic images provided by the present invention; and
Fig. 6 is an operational flowchart of the system for capturing stereoscopic images provided by the present invention.
Detailed description of the embodiments
The following specific examples illustrate the disclosed embodiments of the apparatus, method, and system for capturing stereoscopic images; those skilled in the art can understand the advantages and technical effects of the present invention from the content disclosed in this specification. The present invention can be implemented or applied through other different specific embodiments, and the details in this specification may be modified and changed in various ways based on different viewpoints and applications without departing from its spirit. The accompanying drawings of the present invention are simplified schematic illustrations and are not drawn to actual scale, which is stated in advance. The following embodiments further describe the related technical content of the present invention in detail, but the disclosure is not intended to limit the technical scope of the present invention.
First embodiment
First, refer to Fig. 1, a schematic diagram of the apparatus for capturing stereoscopic images provided by the first embodiment of the present invention. The apparatus D for capturing stereoscopic images of the first embodiment includes a time-of-flight image acquisition module 1 and a light-emitting module 2. The time-of-flight image acquisition module 1 comprises at least a lens module 13 and a time-of-flight sensing unit 14, and is used to capture stereoscopic image information of at least one object O. The light-emitting module 2 is adjacent to the time-of-flight image acquisition module 1. The distance between the light-emitting module 2 and the time-of-flight image acquisition module 1 depends on the image resolution to be achieved and on the distance between the apparatus D for capturing stereoscopic images and the object O under test; the details are explained later.
Continuing from the above, the light-emitting module 2 includes a light generation unit 21, a light diffraction component 22, and a light diffusion component 23. The light generation unit 21, which produces the structured light source SL and the illumination light source IL directed at the object O, includes a light generator and may further include other optical components, such as lenses. The type of light generator in the light generation unit 21 is not limited. The light generation unit 21 produces coherent light (Coherent Light); as long as this purpose is achieved, the selection and arrangement of its internal components can be adjusted. For example, the light generator of the light generation unit 21 may be a laser generator 211 for producing a laser light source L11, such as one producing infrared light. Other light-emitting components can also serve as the light generator of the light generation unit 21, such as a light-emitting diode (Light Emitting Diode, LED) paired with a narrow band-pass filter ("Narrow Bandwidth" Band Pass Filter) to convert the light produced by the LED into the modulated light source. Preferably, the light generation unit 21 includes a laser generator 211, which directly produces a single-wavelength light source without a band-pass filter and provides enough energy for stereoscopic image capture. In the present embodiment, the light generation unit includes the laser generator 211; when a laser generator 211 is used, it may have at least one collimator (Collimator) 2111.
As described above, the light diffraction component 22 and the light diffusion component 23 convert the modulated light produced by the light generation unit 21 into the structured light source SL and the illumination light source IL respectively. In other words, the structured light source SL produced by the light-emitting module 2 is formed by conversion through the light diffraction component 22, and the illumination light source IL is formed by conversion through the light diffusion component 23. In addition, other optical components may be included between the light generation unit 21 and the light diffraction component 22 or the light diffusion component 23, for example lenses such as focusing lenses.
The light diffraction component 22, also known as a diffractive optical element (Diffractive Optical Element, DOE), can be a hologram, a grating, or another suitable optical component. Through the microstructure on its surface, it produces a two-dimensional encoding of the light source. In the embodiment of the present invention, the light diffraction component 22 converts the modulated light produced by the light generation unit 21 into the structured light source SL projected onto the surface of the object O (or of an environment made up of multiple objects O). The light diffusion component 23 evenly diffuses the modulated light produced by the light generation unit 21 to produce the illumination light source IL projected onto the surface of the object O under test. To obtain complete stereoscopic image information of the object O or environment under test, the light diffusion component 23 converts the modulated light produced by the light generation unit 21 into an illumination light source IL that fully covers the object O or environment under test.
It is worth noting that because the light generation unit 21 used in the embodiment of the present invention has a low-complexity structure, it has a small size and volume and can be applied to compact electronic products such as mobile phones. Moreover, the light generation unit 21 only needs a simple electronic control system to make the laser generator 211 produce a pulsed light-source signal; the embodiment of the present invention does not require a complicated panel or peripheral circuits to control the light-output pattern of the light generation unit 21, which reduces not only its volume but also process and production costs.
In the first embodiment, the light generation unit 21 includes a single light generator (the laser generator 211). Therefore, so that the laser light source L11 produced by the laser generator 211 can be converted by the light diffraction component 22 and the light diffusion component 23 into the structured light source SL and the illumination light source IL respectively, the light generation unit 21 further comprises an optical assembly 212 for splitting the laser light source L11. In other words, as shown in Fig. 1, the light generation unit 21 includes the laser generator 211 and the optical assembly 212; the laser light source L11 produced by the laser generator 211 passes in sequence through the optical assembly 212 and the light diffraction component 22 to form the structured light source SL, and in sequence through the optical assembly 212 and the light diffusion component 23 to form the illumination light source IL.
Specifically, the optical assembly 212 includes a beam-splitting component 2121 and a reflecting component 2122. For example, the beam-splitting component 2121 is a half mirror (Half Mirror) with a specific reflectivity/transmittance ratio for splitting the laser light source L11, and the reflecting component 2122 is a light-guiding component such as a mirror. After the laser light source L11 is split, one part of the beam passes through the light diffraction component 22 to form the structured light source SL, and the other part is reflected by the reflecting component 2122 and guided to the light diffusion component 23 to produce the illumination light source IL. The structured light source SL projected onto the surface of the object O is reflected by that surface to form the first reflected light source R1, and the illumination light source IL projected onto the surface of the object O is reflected by that surface to form the second reflected light source R2.
Referring again to Fig. 1, the time-of-flight image acquisition module 1 captures the first reflected light source R1 and the second reflected light source R2 to obtain the stereoscopic image information of the object O. In other words, by using the lens module 13 of the time-of-flight image acquisition module 1 to capture the first reflected light source R1 produced by the structured light source SL, stereoscopic image information of the object O can be obtained through spatial modulation; by capturing the second reflected light source R2 produced by the illumination light source IL, stereoscopic image information of the object O can be obtained through temporal modulation.
The arrangement of the time-of-flight image acquisition module 1 and the light-emitting module 2 in the first embodiment is detailed below. In the present invention, their arrangement depends on the resolution and precision of the stereoscopic image information to be obtained and on the distance between the apparatus D for capturing stereoscopic images and the object O under test. Specifically, when structured light is used to acquire stereoscopic image information, triangulation must be applied. Triangulation uses known parameters, namely the lateral distance between the time-of-flight module 1 and the emission point of the structured light source SL (the baseline of the triangulation, shown as baseline BL along the x direction in Fig. 1) and the angle between the first reflected light source R1 and the baseline BL (the triangulation angle θ shown in Fig. 1), to calculate the depth distance (the z direction in Fig. 1). Therefore, as shown in Fig. 1, the time-of-flight image acquisition module 1, the light diffraction component 22, and the light diffusion component 23 are arranged in a line, and the light-receiving surface 11 of the time-of-flight image acquisition module 1, the exit surface 221 of the light diffraction component 22, and the exit surface 231 of the light diffusion component 23 are all aligned along the same datum axis BA.
Continuing from the above, from the standpoint of triangulation, the farther the light diffraction component 22 is from the time-of-flight image acquisition module 1, the higher the precision of the resulting stereoscopic image information. For example, with a Gray code (Gray Code) algorithm and a 5-megapixel time-of-flight image acquisition module 1, when the distance between the light diffraction component 22 and the time-of-flight image acquisition module 1 (baseline BL) is 14 centimeters and the triangulation angle θ is 20 degrees, a resolution of 0.1 millimeter (mm) can be obtained at a depth of 20 centimeters. With a phase-shift (Phase Shift) algorithm, or a speckle (Speckle) algorithm combined with phase shift, the resolution can be further improved according to the dynamic contrast between bright and dark. However, as the distance between the light diffraction component 22 and the time-of-flight image acquisition module 1 increases, the overall volume of the apparatus D for capturing stereoscopic images also grows. On the other hand, because the illumination light source IL produced by conversion through the light diffusion component 23 measures distance by the principle of temporal modulation, the distance between the light diffusion component 23 and the time-of-flight image acquisition module 1 is not limited in the present invention. In other words, as long as the illumination light source IL projected from the exit surface 231 of the light diffusion component 23 onto the surface of the object O covers the visible area of the time-of-flight image acquisition module 1, that distance can be adjusted according to other parameters.
In summary, the apparatus D for capturing stereoscopic images provided by the first embodiment can run a total of two projection-shooting procedures: projecting the structured light source SL through the light diffraction component 22 and capturing the resulting first reflected light source R1 with the time-of-flight image acquisition module 1, then projecting the illumination light source IL through the light diffusion component 23 and capturing the resulting second reflected light source R2 with the time-of-flight image acquisition module 1, thereby obtaining stereoscopic image information through spatial modulation and temporal modulation respectively. The order of the two projection-shooting procedures may also be reversed; the present invention is not limited in this respect.
By integrating two different 3D scanning techniques, their respective advantages can be combined: structured light performs close-range, high-precision capture of stereoscopic image information, while the time-of-flight method performs mid- and long-range scanning, thereby improving the depth accuracy (Depth Accuracy) of close-range 3D scanning carried out with structured light.
Referring again to Fig. 1, the time-of-flight image sensing module 1 of the first embodiment also includes a switching module 12 for applying color texturing (Color Texturing) to the obtained stereoscopic image information. Specifically, the switching module 12 includes an infrared band-pass (Bandpass) filter, a visible-light band-pass filter, and a light switcher. The light switcher directs the first reflected light source R1 and the second reflected light source R2 to the infrared band-pass filter, or directs ambient light to the visible-light band-pass filter. The light switcher can be, for example, a piezoelectric motor (Piezoelectric Motor), a voice coil motor (Voice Coil Motor, VCM), or an electromagnetic switch.
In this way, image information of different types can be obtained according to the band-pass filter through which the light received by the lens module 13 passes. For example, in infrared mode, the switching of the light switcher directs the first reflected light source R1 and the second reflected light source R2 to the infrared band-pass filter; the wavelengths outside the infrared band in the first reflected light source R1 and the second reflected light source R2 are filtered out, and the reflected images of the structured light source SL and the illumination light source IL are converted by spatial-modulation and temporal-modulation algorithms into black-and-white stereoscopic image information of the object O. In visible mode, the switching of the light switcher directs the light that the ambient light source reflects off the object O to the visible-light band-pass filter; the wavelengths outside the visible band in the reflected light are filtered out, and color image information of the object O is obtained. Specifically, in visible mode the light-emitting module 2 that produces the structured light source SL and the illumination light source IL can be switched off, and the light switcher directs ambient light, such as outdoor natural light (sunlight) or an indoor light source (for example an electric lamp), reflected off the object O to the visible-light band-pass filter, which filters out the non-visible wavelengths in the reflected light and thereby captures the color image information of the object O. In other words, in visible mode no light projection by the light-emitting module is required.
In other words, after the aforementioned two projection-shooting procedures (performed with the structured light source SL and the illumination light source IL respectively), the switch in the switching module 12 can direct the light that the ambient light source reflects off the object O to the visible-light band-pass filter, thereby completing a third, color photographing procedure. This third photographing procedure obtains color image information of the object O; therefore, by integrating and processing the data, the black-and-white stereoscopic image information obtained by the first two projection-shooting procedures can be converted, using the information obtained by the third photographing procedure, into color stereoscopic image information.
In summary, by adding the switching module 12 to the structure of the time-of-flight image sensing module 1, the black-and-white stereoscopic image information can be colorized. Because the time-of-flight image sensing module 1 that includes the switching module 12 can serve both as a time-of-flight camera (TOF Camera) and as a color camera (Color Camera), and these two cameras share the same pixels and a common world coordinate system, fast and seamless colorization can be achieved without any additional calibration procedure, while the need for a separate color camera is eliminated.
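Because the TOF capture and the color capture come from the same sensor and therefore share one pixel grid, the colorization described above reduces to a per-pixel lookup. The following is a minimal illustrative sketch (not taken from the patent; the array shapes and the NumPy formulation are assumptions) of attaching RGB values from the color capture directly to the black-and-white depth map:

```python
import numpy as np

def colorize_depth(depth_map, color_image):
    # depth_map: (H, W) depth values from the infrared-mode capture.
    # color_image: (H, W, 3) RGB values captured by the same sensor in
    # visible-light mode; both arrays share one pixel grid, so no
    # extrinsic calibration or re-projection step is needed.
    assert depth_map.shape == color_image.shape[:2]
    h, w = depth_map.shape
    # One row per pixel: [depth, R, G, B].
    return np.concatenate([depth_map[..., None], color_image],
                          axis=-1).reshape(h * w, 4)
```

Each output row pairs a depth sample with the color of the same pixel, which is the "seamless colorization" that the shared-sensor design makes possible.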
Second embodiment
Refer to Fig. 2, which is a schematic diagram of the device for capturing stereoscopic images provided by the second embodiment of the present invention. The main difference between the second embodiment and the first embodiment lies in the design of the light-emitting module 2. The following description focuses on this difference; the parts of the second embodiment that are substantially the same as the first embodiment are not described again.
As shown in Fig. 2 in second embodiment, light generation unit 21 includes laser generator 211 and luminescence component 213,
LASER Light Source L11 caused by LASER Light Source 211 is by x-ray diffraction component 22 to be converted to structure light source SL, and luminescence component
Projection source L12 caused by 213 is by Photodiffusion component 23 to be converted to lighting source IL.In other words, implement with first
Example unlike, second embodiment be using two independent photogenerators come produce respectively be used for produce structure light source SL and
Lighting source IL LASER Light Source L11 and projection source L12.
Specifically, because the structured light source SL must be produced from a single point source, in the second embodiment it is generated by the laser generator 211 in combination with the collimator 2111, which together produce the laser light source L11. On the other hand, the type of the light-emitting component 213 used to produce the illumination light source IL is not limited; for example, it can be an LED light-emitting component.
Referring again to Fig. 2, the light diffraction component 22 and the light diffusion component 23 are arranged on the same side of the time-of-flight image capturing module 1, and the light diffusion component 23 is disposed between the light diffraction component 22 and the time-of-flight image capturing module 1. As described in the first embodiment, the time-of-flight image capturing module 1, the light diffraction component 22 and the light diffusion component 23 are arranged linearly, and the light receiving surface 111 of the time-of-flight image capturing module 1, the light exiting surface 221 of the light diffraction component 22 and the light exiting surface 231 of the light diffusion component 23 are all aligned along the same datum axis BA. The distance in the x direction (the horizontal direction) between the time-of-flight image capturing module 1 and the light diffraction component 22 is the baseline BL (Baseline) used in triangulation.
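The role of the baseline BL can be illustrated with the standard triangulation relation: depth is proportional to the focal length times the baseline, and inversely proportional to the observed disparity of the projected pattern. A minimal sketch (illustrative only; the numeric values in the usage note below are assumptions, not taken from the patent):

```python
def structured_light_depth(focal_px, baseline_m, disparity_px):
    """Triangulation: Z = f * B / d, where f is the focal length in pixels,
    B the baseline between projector and sensor in metres, and d the
    disparity in pixels between where the projected pattern is expected
    and where it actually appears on the sensor."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

For instance, with an assumed 600 px focal length and a 5 cm baseline, a 10 px disparity corresponds to a depth of 3 m. A longer baseline BL increases the disparity for a given depth and therefore the depth resolution, which is why the spacing between the diffraction component and the capturing module matters.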
Third embodiment
Refer to Fig. 3, which is a schematic diagram of the device for capturing stereoscopic images provided by the third embodiment of the present invention. The main difference between the third embodiment and the second embodiment lies in the arrangement of the components of the light-emitting module 2. The following description focuses on this difference; the parts of the third embodiment that are substantially the same as the second embodiment are not described again.
As shown in Fig. 3, the light generating unit 21 includes a laser generator 211 and a light-emitting component 213. The laser light source L11 produced by the laser generator 211 passes through the light diffraction component 22 to be converted into the structured light source SL, and the projection light source L12 produced by the light-emitting component 213 passes through the light diffusion component 23 to be converted into the illumination light source IL. In addition, the light diffraction component 22 and the light diffusion component 23 are arranged on two opposite sides of the time-of-flight image capturing module 1. It is worth noting that, in both the third embodiment and the foregoing second embodiment, the relative positions of the light diffraction component 22, the light diffusion component 23 and the time-of-flight image capturing module 1 can be adjusted according to the precision of the stereoscopic image information to be obtained and other parameters of the process of manufacturing the device D for capturing stereoscopic images.
In addition, the present invention also provides a method for capturing stereoscopic images. Refer to Fig. 4, which is a flowchart of the method for capturing stereoscopic images provided by the present invention. The method includes the following steps: producing, by a light-emitting module, a structured light source and an illumination light source that are projected onto an object (step S100); reflecting the structured light source and the illumination light source by the object, so as to form a first reflected light and a second reflected light, respectively (step S102); and capturing the first reflected light and the second reflected light by a time-of-flight image capturing module, so as to obtain stereoscopic image information of the object (step S104).
Please refer to Figs. 1 to 3. First, in step S100, the structured light source SL and the illumination light source IL that are projected onto the object O are produced by the light-emitting module 2. The light-emitting module 2 may be any of the light-emitting modules 2 of the foregoing first to third embodiments, and its details are not repeated here.
Next, in step S102, the structured light source SL and the illumination light source IL are reflected by the surface of the object O, thereby forming the first reflected light R1 and the second reflected light R2, respectively. Finally, in step S104, the time-of-flight image capturing module 1 captures the first reflected light R1 and the second reflected light R2 to obtain the stereoscopic image information of the object O. In addition, the method for capturing stereoscopic images provided by the present invention may further include a step of colorizing the stereoscopic image information. The manner in which the stereoscopic image information is colorized by means of the switching module 12 is substantially the same as that described in the first embodiment.
Specifically, the time-of-flight image capturing module 1 may further include a switching module 12, which comprises an infrared band-pass filter, a visible-light band-pass filter and a light switcher. The light switcher is used either to direct the first reflected light R1 and the second reflected light R2 to the infrared band-pass filter, or to direct the reflected light produced when the ambient light source irradiates the object O to the visible-light band-pass filter. In the infrared mode, the first reflected light R1 and the second reflected light R2 are both directed, by the switching of the light switcher, to the infrared band-pass filter; they pass through the infrared band-pass filter and then reach the time-of-flight sensing component 14, so as to produce black-and-white stereoscopic image information. In the visible-light mode, the reflected light produced when the ambient light source irradiates the object O is directed, by the switching of the light switcher, to the visible-light band-pass filter; this reflected light passes through the visible-light band-pass filter and then reaches the time-of-flight sensing component 14, so as to obtain color image information. By processing the foregoing black-and-white stereoscopic image information together with the color image information, color stereoscopic image information can be obtained.
The present invention further provides a system for capturing stereoscopic images, which is used to obtain at least one piece of stereoscopic image information of an object O. Refer to Figs. 5A and 5B, together with Fig. 6. Figs. 5A and 5B are schematic diagrams of the system for capturing stereoscopic images provided by the present invention, and Fig. 6 is a flowchart of the operation of the system for capturing stereoscopic images provided by the present invention. The system S for capturing stereoscopic images provided by the present invention includes a memory unit 51, a processor module 52, a time-of-flight processing module 53, a light-emitting module 58 and a time-of-flight image capturing module 56.
First, refer to Fig. 5A together with Fig. 1. The processor module 52 is electrically connected to the memory unit 51; the time-of-flight processing module 53 is electrically connected to the processor module 52; the light-emitting module 58 is electrically connected to the time-of-flight processing module 53; and the time-of-flight image capturing module 56 is electrically connected to the time-of-flight processing module 53. For example, the light-emitting module 58 corresponds to the light-emitting module 2 in the first embodiment, and the time-of-flight image capturing module 56 corresponds to the time-of-flight image capturing module 1 in the first embodiment.
As shown in Fig. 5A, under the control of the time-of-flight processing module 53, the light-emitting module 58 produces the structured light source SL and the illumination light source IL that are respectively projected onto the object O. The structured light source SL and the illumination light source IL are reflected by the object O to produce a piece of structured light information and a piece of illumination light information, respectively. Under the control of the time-of-flight processing module 53, the time-of-flight image capturing module 56 captures the structured light information and the illumination light information.
Following the above, after the structured light information captured by the time-of-flight image capturing module 56 is computed by the processor module 52, a structured-light 3D point cloud of the object O can be obtained; after the illumination light information captured by the time-of-flight image capturing module 56 is computed by the processor module 52, a time-of-flight 3D point cloud can be obtained; and the structured-light 3D point cloud and the time-of-flight 3D point cloud can be merged, by computation, into a 3D depth map. For example, the structured-light 3D point cloud and the time-of-flight 3D point cloud can be merged, by the computation of the processor module 52, into a 3D depth map that provides the image information of the object O. However, the present invention is not limited in this respect.
Specifically, the time-of-flight processing module 53 is used to modulate the illumination light source IL produced by the light-emitting module 58 (that is, to apply time modulation with a modulation signal) and, following the same scheme previously used to modulate the illumination light source IL, to demodulate (de-modulate) the output of the sensor of the time-of-flight image capturing module 56. The information captured by the time-of-flight image capturing module 56 and the information computed by the processor module 52, including the structured-light 3D point cloud, the time-of-flight 3D point cloud and other information, can all be stored in the memory unit 51. In other words, the information captured by the time-of-flight image capturing module 56 can first be temporarily stored in the memory unit 51 and then sent back to the processor module 52 for processing and computation. In addition, the system S for capturing stereoscopic images may further include a computing unit 57, electrically connected to the processor module 52, for correcting the 3D depth map. The processor module 52 also performs overall control of the time-of-flight image capturing module 56 and the light-emitting module 58, so as to avoid timing interference between these components.
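The modulation and demodulation described above is commonly realized as four-phase continuous-wave demodulation. The following sketch is a generic illustration of that technique, not the patent's implementation; the sampling convention (correlation samples taken at 0°, 90°, 180° and 270° of the modulation period) is an assumption:

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def tof_distance(a0, a1, a2, a3, f_mod):
    # a0..a3: correlation samples of the received signal at phase offsets
    # 0, 90, 180 and 270 degrees of the modulation period (frequency f_mod).
    phase = np.arctan2(a3 - a1, a0 - a2)   # wrapped phase shift
    phase = np.mod(phase, 2 * np.pi)       # map to [0, 2*pi)
    # Phase shift over the round trip converted to one-way distance.
    return C * phase / (4 * np.pi * f_mod)
```

The unambiguous range of such a scheme is C / (2 * f_mod), roughly 7.5 m at an assumed 20 MHz modulation frequency, which is consistent with the middle- and long-range role the text assigns to the time-of-flight capture.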
Next, refer to Fig. 5B, together with Figs. 2 and 3. As shown in Fig. 5B, the system S for capturing stereoscopic images includes a memory unit 51, a processor module 52, a time-of-flight processing module 53, a first light-emitting module 54, a second light-emitting module 55 and a time-of-flight image capturing module 56. The processor module 52 is electrically connected to the memory unit 51; the time-of-flight processing module 53 is electrically connected to the processor module 52; the first light-emitting module 54 is electrically connected to the processor module 52; the second light-emitting module 55 is electrically connected to the time-of-flight processing module 53; and the time-of-flight image capturing module 56 is electrically connected to the time-of-flight processing module 53. Under the control of the processor module 52, the first light-emitting module 54 produces the structured light source SL projected onto the object O; the structured light source SL is reflected by the object O to produce structured light information. Under the control of the time-of-flight processing module 53, the second light-emitting module 55 produces the illumination light source IL projected onto the object O; the illumination light source IL is reflected by the object O to produce illumination light information. Under the control of the time-of-flight processing module 53, the time-of-flight image capturing module 56 captures the structured light information and the illumination light information.
For example, the first light-emitting module 54 and the second light-emitting module 55 correspond, respectively, to the laser generator 211 and the light-emitting component 213 shown in Figs. 2 and 3, and the time-of-flight image capturing module 56 corresponds to the time-of-flight image capturing module 1 in the first to third embodiments. In other words, the system for capturing stereoscopic images shown in Fig. 5B can use the devices for capturing stereoscopic images of Figs. 2 and 3. The difference between the systems shown in Fig. 5B and Fig. 5A is that, in the system of Fig. 5B, the first light-emitting module 54 can be switched on and off (on/off switching) directly by the processor module 52; in other words, the first light-emitting module 54 does not need to be time-modulated by the time-of-flight processing module 53. The other details of Fig. 5B are substantially the same as those of Fig. 5A and are not repeated here.
Refer to Fig. 6, together with Figs. 1 to 3, 5A and 5B as needed. With the system for capturing stereoscopic images shown in Figs. 5A and 5B, the procedure of capturing stereoscopic image information can be carried out. As shown in Fig. 6, first, the system S for capturing stereoscopic images is operated in a black-and-white mode to obtain black-and-white image information (step S200); in other words, in step S200 the system S for capturing stereoscopic images is operated in the infrared (IR) mode. Step S200 includes a procedure of obtaining a structured-light 3D point cloud by structured light (steps S2001, S2003, S2005) and a procedure of obtaining a time-of-flight 3D point cloud by the time-of-flight method (steps S2002, S2004, S2006). The order in which these two procedures are carried out can be adjusted as needed.
For example, the structured light source SL projected onto the at least one object O can be produced first (step S2001); that is, the structured light source SL can be produced by the combination of the light generating unit 21 and the light diffraction component 22 (contained in the light-emitting module 58 or the first light-emitting module 54). The structured light source SL is reflected by the object O to form the first reflected light R1, and the time-of-flight image capturing module 56 captures the first reflected light R1 to produce the structured light information (step S2003). Next, the structured light information captured by the time-of-flight image capturing module 56 is computed by the processor module 52 to obtain the structured-light 3D point cloud (step S2005). At this point, the procedure of capturing the stereoscopic image of the object O with structured light is complete.
Next, the illumination light source IL projected onto the at least one object O is produced (step S2002); that is, the illumination light source IL can be produced by the combination of the light generating unit 21 and the light diffusion component 23 (contained in the light-emitting module 58 or the second light-emitting module 55). The illumination light source IL is reflected by the object O to form the second reflected light R2, and the time-of-flight image capturing module 56 captures the second reflected light R2 to produce the illumination light information (step S2004). Next, the illumination light information captured by the time-of-flight image capturing module 56 is computed by the processor module 52 to obtain the time-of-flight 3D point cloud (step S2006). At this point, the procedure of capturing the stereoscopic image of the object O by the time-of-flight method is complete.
Then, the structured-light 3D point cloud and the time-of-flight 3D point cloud are merged, by the computation of the processor module 52, into a black-and-white 3D depth map (step S2007). In this step, the information obtained by the close-range stereoscopic scan performed with structured light and the information obtained by the middle- and long-range stereoscopic scan performed with the time-of-flight method are integrated. In this way, structured light can be used for high-precision stereoscopic image capture at close range while the time-of-flight method performs the stereoscopic scan at middle and long range, and the time-of-flight method in turn improves the depth accuracy of the close-range stereoscopic scan performed with structured light.
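The merge of step S2007 can be pictured as a per-pixel selection: keep the structured-light depth wherever it is valid and within its high-accuracy near range, and fall back to the time-of-flight depth for the middle and long range. A minimal sketch (the 1.5 m cutoff and the NaN-for-missing convention are illustrative assumptions, not values from the patent):

```python
import numpy as np

def fuse_depth(sl_depth, tof_depth, near_limit_m=1.5):
    # sl_depth / tof_depth: (H, W) depth maps registered on the same
    # pixel grid; NaN marks pixels where a method gave no measurement.
    sl_ok = np.isfinite(sl_depth) & (sl_depth <= near_limit_m)
    # Structured light wins at close range, time-of-flight elsewhere.
    return np.where(sl_ok, sl_depth, tof_depth)
```

A production merge would typically blend the two estimates near the cutoff (for example by confidence weighting) rather than switch hard, but the selection above captures the near/far division of labor the text describes.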
Following the above, because the system S for capturing stereoscopic images operated in the infrared mode can obtain only a black-and-white 3D depth map, steps S202 and S204 can be carried out after step S200: the system S for capturing stereoscopic images is operated in a color mode to obtain color image information, and the black-and-white 3D depth map is colored based on the pixel information in the color image information, so as to obtain a color 3D depth map. It is worth noting that, because the pixels of the pixel information in the color image information are fully aligned with those of the foregoing information obtained with the structured light source SL or the illumination light source IL, the black-and-white 3D depth map can be colored with high precision without any additional extrinsic calibration. Step S202 can be completed by using the switching module 12 described with reference to Figs. 1 to 3. Specifically, step S202 includes projecting ambient light onto the object O, capturing, by the time-of-flight image capturing module 56, the color reflected light reflected from the surface of the object O, guiding the color reflected light to the visible-light band-pass filter by the light switcher in the switching module 12, and obtaining the color image information of the object O. Finally, in step S204, the black-and-white 3D depth map obtained in step S200 is colored based on the color image information obtained in step S202, so as to obtain the color 3D depth map.
Beneficial effects of the present invention
In summary, the beneficial effect of the present invention is that, by integrating structured light and time-of-flight technology in a single device or system for capturing three-dimensional images, the present invention can obtain stereoscopic image information with high depth precision and low noise, while reducing the overall volume and the power consumption of the device and system.
Specifically, in the device, method and system provided by the present invention, structured light is used for close-range, high-precision capture of stereoscopic image information, and the time-of-flight method is used for the stereoscopic scan at middle and long range. In this way, the range of the stereoscopic scan can be effectively increased, which broadens the applications of the product. Furthermore, because the device and system of the present invention are small in volume and low in structural complexity, they are suitable for smaller electronic products and can be operated with less energy. In detail, in the device and system provided by the present invention, the structured light source SL can be produced by simple components, namely a small laser generator 211 together with a collimator 2111 and a light diffraction component 22 (both of which are small lenses); the laser generator 211 only needs to be driven by a pulsed signal and requires no complex control module, so the manufacturing cost can be greatly reduced. Moreover, the time-of-flight method can be used to improve the depth accuracy (Depth Accuracy) of the close-range stereoscopic scan performed with structured light.
In addition, in the device, method and system provided by the present invention, a switching module can be added to the time-of-flight image capturing module, so that fast and seamless colorization is achieved without increasing the number of cameras or the size of the device and system.
The content disclosed above covers only preferred possible embodiments of the present invention and does not thereby limit the scope of the claims of the present invention; accordingly, all equivalent technical changes made according to the specification and drawings of the present invention are included within the scope of the claims of the present invention.
Claims (13)
1. A device for capturing stereoscopic images, characterized in that the device for capturing stereoscopic images comprises:
a time-of-flight image capturing module for capturing stereoscopic image information of at least one object; and
a light-emitting module adjacent to the time-of-flight image capturing module, wherein the light-emitting module produces a structured light source and an illumination light source that are projected onto the object;
wherein the structured light source is reflected by the object to form a first reflected light, the illumination light source is reflected by the object to form a second reflected light, and the time-of-flight image capturing module captures the first reflected light and the second reflected light to obtain the stereoscopic image information of the object.
2. The device for capturing stereoscopic images according to claim 1, characterized in that the light-emitting module comprises a light generating unit, a light diffraction component and a light diffusion component; the structured light source produced by the light-emitting module is formed after conversion by the light diffraction component, and the illumination light source produced by the light-emitting module is formed after conversion by the light diffusion component.
3. The device for capturing stereoscopic images according to claim 2, characterized in that the light generating unit comprises a laser generator for producing a laser light source and an optical assembly; the laser light source sequentially passes through the optical assembly and the light diffraction component to form the structured light source, and the laser light source sequentially passes through the optical assembly and the light diffusion component to form the illumination light source.
4. The device for capturing stereoscopic images according to claim 3, characterized in that the laser generator has at least one collimator and the optical assembly comprises a beam-splitting assembly and a reflecting assembly; the laser light source sequentially undergoes collimation by the at least one collimator, splitting by the beam-splitting assembly and conversion by the light diffraction component to form the structured light source; and the laser light source sequentially undergoes collimation by the at least one collimator, splitting by the beam-splitting assembly, reflection by the reflecting assembly and conversion by the light diffusion component to form the illumination light source.
5. The device for capturing stereoscopic images according to claim 3, characterized in that the light generating unit comprises a laser generator and a light-emitting component; a laser light source produced by the laser generator passes through the light diffraction component to be converted into the structured light source, and a projection light source produced by the light-emitting component passes through the light diffusion component to be converted into the illumination light source.
6. The device for capturing stereoscopic images according to claim 3, characterized in that the time-of-flight image capturing module, the light diffraction component and the light diffusion component are arranged linearly, and a light receiving surface of the time-of-flight image capturing module, a light exiting surface of the light diffraction component and a light exiting surface of the light diffusion component are all aligned along the same datum axis.
7. The device for capturing stereoscopic images according to claim 3, characterized in that the light diffraction component and the light diffusion component are arranged on the same side of the time-of-flight image capturing module, and the light diffusion component is disposed between the light diffraction component and the time-of-flight image capturing module.
8. The device for capturing stereoscopic images according to claim 3, characterized in that the light diffraction component and the light diffusion component are respectively arranged on two opposite sides of the time-of-flight image capturing module.
9. The device for capturing stereoscopic images according to claim 3, characterized in that the time-of-flight image capturing module further comprises a switching module, the switching module comprising:
an infrared band-pass filter;
a visible-light band-pass filter; and
a light switcher for directing both the first reflected light and the second reflected light to the infrared band-pass filter, or for directing the reflected light produced when an ambient light source irradiates the object to the visible-light band-pass filter;
wherein, when the first reflected light and the second reflected light are directed to the infrared band-pass filter by the switching of the light switcher, the first reflected light and the second reflected light pass through the infrared band-pass filter to produce black-and-white stereoscopic image information;
wherein, when the reflected light produced when the ambient light source irradiates the object is directed to the visible-light band-pass filter by the switching of the light switcher, that reflected light passes through the visible-light band-pass filter to obtain color image information.
10. A system for capturing stereoscopic images, used to capture image information of at least one object, characterized in that the system for capturing stereoscopic images comprises:
a processor module;
a time-of-flight processing module electrically connected to the processor module;
a light-emitting module for producing a structured light source projected onto the object and an illumination light source projected onto the object, wherein the structured light source is reflected by the object to produce structured light information, and the illumination light source is reflected by the object to produce illumination light information; and
a time-of-flight image capturing module, electrically connected to the time-of-flight processing module, for capturing the structured light information and the illumination light information;
wherein a structured-light 3D point cloud is obtained after the structured light information captured by the time-of-flight image capturing module is computed by the processor module;
wherein a time-of-flight 3D point cloud is obtained after the illumination light information captured by the time-of-flight image capturing module is computed by the processor module;
wherein the structured-light 3D point cloud and the time-of-flight 3D point cloud can be merged into a 3D depth map for providing the image information.
11. The system for capturing stereoscopic images according to claim 10, characterized in that the light-emitting module comprises a light generating unit, a light diffraction component and a light diffusion component; the structured light source produced by the light-emitting module is formed after conversion by the light diffraction component, and the illumination light source produced by the light-emitting module is formed after conversion by the light diffusion component.
12. The system for capturing stereoscopic images according to claim 10, characterized in that the system for capturing stereoscopic images further comprises a memory unit, wherein the structured-light 3D point cloud, the time-of-flight 3D point cloud and the 3D depth map are stored in the memory unit.
13. The system for capturing stereoscopic images according to claim 10, characterized in that the system for capturing stereoscopic images further comprises a computing unit electrically connected to the processor module for correcting the 3D depth map.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610737230.5A CN107783353B (en) | 2016-08-26 | 2016-08-26 | Device and system for capturing three-dimensional image |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107783353A true CN107783353A (en) | 2018-03-09 |
CN107783353B CN107783353B (en) | 2020-07-10 |
Family
ID=61439738
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610737230.5A Active CN107783353B (en) | 2016-08-26 | 2016-08-26 | Device and system for capturing three-dimensional image |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107783353B (en) |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013138434A (en) * | 2011-12-27 | 2013-07-11 | Thomson Licensing | Device for acquisition of stereoscopic images |
CN104769389A (en) * | 2012-11-05 | 2015-07-08 | 赫克斯冈技术中心 | Method and device for determining three-dimensional coordinates of an object |
CN103959089A (en) * | 2012-11-21 | 2014-07-30 | Lsi公司 | Depth imaging method and apparatus with adaptive illumination of an object of interest |
CN104903677A (en) * | 2012-12-17 | 2015-09-09 | Lsi公司 | Methods and apparatus for merging depth images generated using distinct depth imaging techniques |
CN205193425U (en) * | 2015-08-07 | 2016-04-27 | 高准精密工业股份有限公司 | Optical device |
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110278429A (en) * | 2018-03-18 | 2019-09-24 | 宁波舜宇光电信息有限公司 | Depth information camera module and its base assembly, electronic equipment and preparation method |
US11493605B2 (en) | 2018-03-18 | 2022-11-08 | Ningbo Sunny Opotech Co., Ltd. | Depth information camera module and base assembly, projection assembly, electronic device and manufacturing method thereof |
CN110278429B (en) * | 2018-03-18 | 2021-09-21 | 宁波舜宇光电信息有限公司 | Depth information camera module, base assembly thereof, electronic equipment and preparation method |
CN110471193A (en) * | 2018-05-10 | 2019-11-19 | 视锐光科技股份有限公司 | The integrated structure of floodlight illuminator and dot matrix projector |
WO2019218265A1 (en) * | 2018-05-16 | 2019-11-21 | Lu Kuanyu | Multi-spectrum high-precision method for identifying objects |
CN110611755A (en) * | 2018-05-29 | 2019-12-24 | 广州印芯半导体技术有限公司 | Image sensing system and multifunctional image sensor thereof |
US11423557B2 (en) | 2018-06-28 | 2022-08-23 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Depth processor and three-dimensional image device |
CN112424673A (en) * | 2018-08-24 | 2021-02-26 | Oppo广东移动通信有限公司 | Infrared projector, imaging device and terminal device |
CN111083453A (en) * | 2018-10-18 | 2020-04-28 | 中兴通讯股份有限公司 | Projection device, method and computer readable storage medium |
CN111083453B (en) * | 2018-10-18 | 2023-01-31 | 中兴通讯股份有限公司 | Projection device, method and computer readable storage medium |
US11422262B2 (en) | 2019-01-15 | 2022-08-23 | Shenzhen Guangjian Technology Co., Ltd. | Switchable diffuser projection systems and methods |
CN113253475A (en) * | 2019-01-25 | 2021-08-13 | 深圳市光鉴科技有限公司 | Switchable diffuser projection system and method |
WO2020187175A1 (en) * | 2019-03-21 | 2020-09-24 | 深圳市光鉴科技有限公司 | Light projection system and light projection method |
EP3944000A4 (en) * | 2019-03-21 | 2022-05-04 | Shenzhen Guangjian Technology Co., Ltd. | System and method for enhancing time-of-flight resolution |
WO2020187176A1 (en) * | 2019-03-21 | 2020-09-24 | 深圳市光鉴科技有限公司 | System and method for enhancing time-of-flight resolution |
CN111322961A (en) * | 2019-03-21 | 2020-06-23 | 深圳市光鉴科技有限公司 | System and method for enhancing time-of-flight resolution |
CN112887697A (en) * | 2021-01-21 | 2021-06-01 | 北京华捷艾米科技有限公司 | Image processing method and system |
CN112887697B (en) * | 2021-01-21 | 2022-06-10 | 北京华捷艾米科技有限公司 | Image processing method and system |
Also Published As
Publication number | Publication date |
---|---|
CN107783353B (en) | 2020-07-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107783353A (en) | Apparatus and system for capturing stereoscopic images | |
US11067692B2 (en) | Detector for determining a position of at least one object | |
US10401143B2 (en) | Method for optically measuring three-dimensional coordinates and controlling a three-dimensional measuring device | |
CN104634276B (en) | Three-dimension measuring system, capture apparatus and method, depth computing method and equipment | |
US10823818B2 (en) | Detector for optically detecting at least one object | |
US10088296B2 (en) | Method for optically measuring three-dimensional coordinates and calibration of a three-dimensional measuring device | |
US20160134860A1 (en) | Multiple template improved 3d modeling of imaged objects using camera position and pose to obtain accuracy | |
US20170244953A1 (en) | Imaging optical system and 3d image acquisition apparatus including the imaging optical system | |
US20140307100A1 (en) | Orthographic image capture system | |
EP3069100B1 (en) | 3d mapping device | |
CN108718406B (en) | Variable-focus 3D depth camera and imaging method thereof | |
CN102438111A (en) | Three-dimensional measurement chip and system based on double-array image sensor | |
CN103486979A (en) | Hybrid sensor | |
CN108592886A (en) | Image capture device and image-pickup method | |
CN102865849A (en) | Camera device for ranging and ranging method | |
CN202915911U (en) | Shooting device for distance measurement | |
CN108534703A (en) | Structured light generators and depth data measurement head and measuring device | |
US11326874B2 (en) | Structured light projection optical system for obtaining 3D data of object surface | |
JP3818028B2 (en) | 3D image capturing apparatus and 3D image capturing method | |
TWI630431B (en) | Device and system for capturing 3-d images | |
CN208536839U (en) | Image capture device | |
CN207650834U (en) | Face information measurement assembly | |
CN208505256U (en) | Structured light generators | |
JP2005331413A (en) | Distance image acquiring system | |
CN116753861A (en) | Three-dimensional reconstruction system and three-dimensional reconstruction method based on multi-wavelength super-surface element |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
TR01 | Transfer of patent right | Effective date of registration: 2023-05-06; Patentee after: LUXVISIONS INNOVATION Ltd. (No. 25, Spectrum West Road, Science City, Guangzhou High-Tech Industrial Development Zone, Guangzhou, Guangdong); Patentee before: LITE-ON ELECTRONICS (GUANGZHOU) Ltd. and Lite-On Technology Co.,Ltd. (510730, 25 West Road, Science City, Guangzhou High-Tech Industrial Development Zone, Guangdong) ||