CN206807664U - Depth image acquisition system - Google Patents
Depth image acquisition system
- Publication number
- CN206807664U CN206807664U CN201720235697.XU CN201720235697U CN206807664U CN 206807664 U CN206807664 U CN 206807664U CN 201720235697 U CN201720235697 U CN 201720235697U CN 206807664 U CN206807664 U CN 206807664U
- Authority
- CN
- China
- Prior art keywords
- image
- optical
- depth image
- depth
- acquisition system
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Landscapes
- Length Measuring Devices By Optical Means (AREA)
Abstract
The utility model discloses a depth image acquisition system. The system includes: an optical projection unit comprising at least two optical projectors, each of which emits a structured light image at its own wavelength; an image acquisition unit comprising an optical filter and an image sensor; and a processor unit that receives the optical image and processes it to obtain a depth image. The beneficial effects of the utility model are: a depth image acquisition system is provided in which the optical projection unit emits structured light images at no fewer than two wavelengths; a single image acquisition unit captures the different wavelengths synchronously, and the processor unit processes the resulting optical image into parallax-free depth images. These depth images can correspond to different viewing angles, eliminating the shadow regions produced by a single depth image, or to different distances, enabling measurement over a larger depth range.
Description
Technical field
The utility model relates to the field of optical projection and measurement technology, and in particular to a depth image acquisition system.
Background technology
A depth camera obtains depth images of an object, which can then be used for 3D modeling, skeleton extraction, and similar tasks, with wide application in fields such as 3D measurement and human-computer interaction. Among depth cameras, structured-light depth cameras are currently the most widely used because of advantages such as low cost and high imaging resolution. Nevertheless, several problems remain. The measurement range of a depth camera is limited, and measurement accuracy declines roughly exponentially with measurement distance; in addition, the common configuration of a single projection module plus a single imaging camera often produces depth images with shadow regions. These shortcomings negatively affect depth-camera applications, particularly those with high requirements on measurement range and accuracy.
Content of the invention
To solve the prior-art problems that depth information cannot be obtained in shadow regions and that measurement error increases sharply with measurement distance, the utility model provides a depth image acquisition system as a hardware foundation.
To solve the above problems, the utility model adopts the following technical scheme:
A depth image acquisition system, including:
an optical projection unit including at least two optical projectors, each of which emits a structured light image at its own wavelength;
an image acquisition unit including an optical filter and an image sensor; the optical filter includes at least two filter units, each of which passes only the light emitted by one of the at least two optical projectors; the image sensor receives the light passing through the optical filter, converts it into an optical image, and sends the optical image to a processor unit;
a processor unit for receiving the optical image and computing a depth image from it.
Preferably, the system further includes a memory unit for storing the depth image.
Preferably, the processor unit includes: one or more processors; a memory; and one or more programs stored in the memory and configured to be executed by the one or more processors, the programs including instructions for the following steps: receiving the optical image; separating the optical image into the structured light images corresponding to the at least two projectors; and computing the corresponding depth images from the at least two structured light images.
Preferably, the processor unit is further used to control the projection of the optical projection unit and/or the image capture of the image acquisition unit.
Preferably, the at least two structured light images differ in at least one of wavelength, light intensity, and pattern density.
Preferably, the at least two optical projectors and the image acquisition unit are arranged in the same plane, and the distances between the image acquisition unit and each of the at least two optical projectors differ.
Preferably, the light source of each optical projector is a VCSEL array laser.
Preferably, the image sensor is a CMOS or CCD sensor.
Preferably, the processor unit includes an AP processor.
The beneficial effects of the utility model are: a depth image acquisition system is provided in which the optical projection unit emits structured light images at no fewer than two wavelengths, and a single image acquisition unit captures the different wavelengths synchronously. The resulting depth images can correspond to different viewing angles, eliminating the shadow regions of a single depth image, or to different distances, enabling measurement over a larger depth range. This provides a hardware foundation for solving the problems of missing depth information in shadow regions and of measurement error that grows sharply with distance.
Brief description of the drawings
Fig. 1 is a schematic diagram of the image acquisition system of embodiment 1 of the utility model placed in a mobile device.
Fig. 2 is a schematic diagram of the depth image acquisition system of embodiment 2 of the utility model.
Fig. 3 is a schematic diagram of the image acquisition unit of embodiments 1 and 2 of the utility model.
Fig. 4 is a schematic diagram of the filter units of the image acquisition unit of embodiment 3 of the utility model.
Fig. 5 is a schematic diagram of image processing by the processor unit of embodiment 4 of the utility model.
Fig. 6 is a schematic diagram of the depth image acquisition method of embodiments 1, 2, 3, and 4 of the utility model.
Embodiments
The utility model is described in detail below through specific embodiments with reference to the accompanying drawings, for a better understanding of the utility model; the following embodiments do not limit its scope. In addition, it should be noted that the figures provided with the embodiments illustrate the basic concept of the utility model only schematically: they show only the components relevant to the utility model, rather than the actual number, shape, and size of components in a real implementation, where the form, quantity, and proportion of each component may vary arbitrarily and the component layout may be more complex.
Embodiment 1
As shown in Fig. 1, a schematic diagram of the image acquisition system of this embodiment placed in a mobile device, this is a concrete application of the depth image acquisition system of the utility model as a built-in unit of a mobile device. The depth image acquisition system is embedded in the mobile device 4 as a single embedded component and includes a first optical projector 1, an image acquisition unit 2, and a second optical projector 3; the processor used is the AP processor of the mobile device. In this embodiment, the mobile device 4 is a mobile phone; the depth image acquisition system is embedded at the top of the mobile device 4, with the first optical projector 1, the image acquisition unit 2, and the second optical projector 3 arranged in the same plane, and the distances between the image acquisition unit and each of the two optical projectors differ. A mobile device 4 with an embedded image acquisition system can obtain depth images of a target, which can further be used for applications such as 3D scanning, 3D modeling, and 3D recognition.
In some alternative versions of this embodiment, the mobile device 4 may instead be a tablet (PAD), a computer, a smart television, and so on; the embedded position may also be elsewhere, such as the side, the bottom, or the back.
As shown in Fig. 6, the method by which the mobile device 4 with the embedded image acquisition system obtains a depth image includes the following steps:
(1) The first optical projector 1 emits a first structured light image at a first wavelength, and the second optical projector 3 emits a second structured light image at a second wavelength; the first and second wavelengths are different infrared wavelengths; the first and second structured light images differ in light intensity; and the first and second structured light images differ in pattern density.
In some alternative versions of this embodiment, the structured light images may be, for example, infrared or ultraviolet images; many kinds of structured light patterns are possible, such as speckle or stripes; and the light sources of the first optical projector 1 and the second optical projector 3 may be VCSEL array lasers.
The first optical projector 1, the image acquisition unit 2, and the second optical projector 3 are configured on the same baseline; the first optical projector 1 and the second optical projector 3 may be located on either side of the image acquisition unit 2, with the distance between the first optical projector 1 and the image acquisition unit 2 greater than the distance between the second optical projector 3 and the image acquisition unit 2.
In some alternative versions of this embodiment, the relative positions of the first optical projector 1, the image acquisition unit 2, and the second optical projector 3 are not limited; alternatively, the distances from the first optical projector 1 and the second optical projector 3 to the image acquisition unit 2 may be set differently.
(2) As shown in Fig. 3, the image acquisition unit 2 includes a filter unit 21 and an image sensor unit 22. The filter unit 21 includes a first filter unit and a second filter unit, which pass only light of the first wavelength and the second wavelength, respectively. The image sensor unit 22 obtains an optical image and sends it to the processor unit. A point in space is imaged onto a pixel of the image sensor after its light 6 is focused by the lens 7, and the image sensor converts light intensity into a corresponding digital signal. The depth image acquisition system has only one image acquisition unit 2, which synchronously captures the structured light images of the first optical projector 1 and the second optical projector 3.
In alternative versions of this embodiment, the image sensor may be a CMOS or CCD sensor.
(3) The processor unit used in this embodiment is the AP processor of the mobile device 4; it receives the optical image and processes it to compute depth images.
In some alternative versions of this embodiment, the processor unit may instead include multiple processors, for example a dedicated SOC chip for depth computation together with the AP processor of the mobile device, where the dedicated SOC chip computes the depth images and the AP processor performs functions such as image fusion.
The processor unit includes: one or more processors; a memory; and one or more programs stored in the memory and configured to be executed by the one or more processors, the programs including instructions for the following steps: receiving the optical image; separating the optical image into the first structured light image and the second structured light image; computing a first depth image from the first-wavelength structured light image; and computing a second depth image from the second-wavelength structured light image.
The processor unit is further used to control the projection of the optical projection unit and the image capture of the image acquisition unit.
In alternative versions of this embodiment, the processor unit may control either the projection of the optical projection unit or the image capture of the image acquisition unit.
The depth image is obtained by computing, for each pixel, the deviation between the structured light image and a reference structured light image, and then computing the depth value of each pixel from that deviation using the triangulation principle; the reference structured light image is a structured light image captured in advance on a plane at a known distance from the image acquisition unit.
The program of the processor unit is further used to fuse the first depth image and the second depth image into a third depth image.
The fusion includes: taking the first (or second) depth image as reference and substituting the corresponding depth values in it with the valid depth values of the second (or first) depth image, where a valid depth value is one at a pixel that is a hole in the first (or second) depth image but not a hole in the second (or first) depth image.
The fusion includes: taking the weighted average of corresponding pixel values in the first depth image and the second depth image as the pixel value of the fused depth image.
The fusion includes: using corresponding pixel values of the first depth image and the second depth image to compute sub-pixel values, thereby increasing the resolution of the depth image.
The processing and depth-computation methods of the processor unit described above may, according to actual needs, be used in full or in part.
Embodiment 2
As shown in Fig. 2, a schematic diagram of the depth image acquisition system of this embodiment, the system is a standalone device including a first optical projector 1, an image acquisition unit 2, a second optical projector 3, and a processor unit 5.
As shown in Fig. 6, the method by which the depth image acquisition system obtains a depth image includes the following steps:
(1) The optical projection unit includes the first optical projector 1 and the second optical projector 3; the first optical projector 1 emits a first structured light image at a first wavelength, and the second optical projector 3 emits a second structured light image at a second wavelength.
(2) The image acquisition unit 2 includes a filter unit 21 and an image sensor unit 22; the filter unit 21 includes a first filter unit and a second filter unit, which pass only light of the first and second wavelengths, respectively; the image sensor unit 22 obtains an optical image and sends it to the processor unit.
(3) The processor unit 5 receives the optical image and processes it to compute depth images.
Unlike embodiment 1, the depth image acquisition system of this embodiment is a standalone device connected to other equipment through an interface for data input and output; the interface here is a USB interface. In this embodiment, the depth image acquisition system also includes a memory unit for storing the obtained depth images.
In alternative versions of this embodiment, data input and output may also use other kinds of interface, WIFI, and so on.
Embodiment 3
As shown in Fig. 4, a schematic diagram of the filter units of the image acquisition unit of this embodiment. An ordinary RGB camera uses a Bayer filter: the filter has one filter unit per image-sensor pixel, with units that pass red, green, or blue light respectively; because the human eye is more sensitive to green, the ratio of the three is R (25%) : G (50%) : B (25%). In this embodiment, the depth image acquisition system includes a first optical projector 1, an image acquisition unit 2, a second optical projector 3, and a processor unit 5. The filter unit 21 of the image acquisition unit 2 consists of two parts, where IR1 and IR2 denote infrared light of two different wavelengths: pixels under IR1 filter units collect the infrared image at wavelength IR1, and pixels under IR2 filter units collect the infrared image at wavelength IR2. The first optical projector 1 emits IR1 infrared light and the second optical projector 3 emits IR2 infrared structured light, so the image sensor 22 simultaneously records the structured light information emitted by both projectors. Because each kind of information occupies only part of the pixels (the ratio of the two kinds is 1:1 in this embodiment), interpolation is needed to recover the intensity of the other component at each pixel, so that a complete first structured light image and second structured light image are finally obtained synchronously. The interpolation here uses a weighted average.
In alternative versions of this embodiment, other interpolation methods may be used; since these are prior art, they are not described in detail here.
As shown in Fig. 6, this is the method by which the depth image acquisition system of this embodiment obtains a depth image.
In an alternative version of this embodiment, there is a computer-readable storage medium storing a computer program for use with the depth image acquisition device; when executed by a processor, the computer program implements any of the methods described in the utility model.
In another alternative version of this embodiment, the first optical projector 1 and the second optical projector 3 emit near-infrared and far-infrared light respectively, so the IR1 and IR2 units of the filter obtain a near-infrared image and a far-infrared image. It should be noted that, in other alternative versions of the utility model, any other combination of wavelengths may be used.
Embodiment 4
As shown in Fig. 5, a schematic diagram of image processing by the processor unit according to one embodiment of the utility model, the depth image acquisition system includes a first optical projector 1, an image acquisition unit 2, a second optical projector 3, and a processor unit 5. The processor unit 5 processes the optical image by separating it into a first structured light image and a second structured light image; obtaining the depth images then includes computing a first depth image from the first-wavelength structured light image and a second depth image from the second-wavelength structured light image.
First, the image sensor 22 captures an optical image containing both wavelengths (for example near-infrared and far-infrared light). This optical image is then output to the processor unit 5, which splits it in two: a first structured light image containing the structured light information emitted by the first optical projector 1, and a second structured light image containing the structured light information emitted by the second optical projector 3. The processor unit further computes the first and second depth images from the structured light images, and finally fuses them into a third depth image for output; the first and second depth images may also be output individually.
Depth is computed from a structured light image by the structured-light triangulation principle. Taking a speckle image as an example, a structured light image captured in advance on a plane at a known depth serves as the reference image; the processor unit 5 then uses the currently captured structured light image and the reference image, computes the deviation (deformation) of each pixel through an image matching algorithm, and finally computes the depth by triangulation. The calculation formula is as follows:
where Z_D is the depth of the spatial point from the acquisition module, i.e. the depth value to be found, B is the distance between the acquisition camera and the structured light projector, Z_0 is the depth of the reference plane from the acquisition module, and f is the focal length of the lens in the acquisition camera.
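The formula itself did not survive the extraction of this page. From the symbol definitions above, the standard structured-light triangulation relation — a reconstruction, not a quote of the original drawing — is Z_D = f·B·Z_0 / (f·B + Z_0·Δ), where Δ is the per-pixel deviation from image matching (with the sign convention Δ = f·B·(1/Z_D − 1/Z_0), in the same units as f). A minimal sketch with hypothetical numbers:

```python
def depth_from_deviation(delta, f, B, Z0):
    """Structured-light triangulation: deviation delta = 0 recovers the
    reference-plane depth Z0, and positive deviation maps to a point
    nearer than the reference plane."""
    return (f * B * Z0) / (f * B + Z0 * delta)
```

`delta` may also be a NumPy array of per-pixel deviations, in which case the same expression yields a full depth map in one call.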
The specific image-processing method also varies with the configuration of the optical projectors.
As shown in Fig. 6, this is the method by which the depth image acquisition system of this embodiment obtains a depth image.
In one alternative version of this embodiment, the intensity and density of the structured light pattern projected by the first optical projector 1 are both greater than those of the second optical projector 3, and the distance between the first optical projector 1 and the image acquisition unit 2 is also greater than that of the second optical projector. The purpose of this configuration is that the first structured light image can contain more distant targets and has better structured-light features for them, so for distant objects the processor unit 5 can obtain more accurate first depth information; the second structured light image can only capture near-range depth information and may exhibit holes and similar artifacts for distant targets. Because the first and second structured light images are obtained by the same image sensor, there is no parallax between them, so the pixels of the first and second depth images correspond one to one. Since, as described above, the depth information of distant objects is more accurate and reliable in the first depth image while that of near objects is more accurate and reliable in the second depth image, the two depth images can be fused.
One fusion method is: first choose a depth threshold; for each pixel, check whether the pixel value in the first and second depth images reaches the threshold; if it is below the threshold, take the pixel value from the second depth image, and otherwise from the first. This fusion yields a third depth image in which every pixel has higher accuracy than in the first and second depth images.
Another fusion method is: select a weighted-average scheme, i.e. take the weighted average of corresponding pixels of the first and second depth images to obtain a more accurate third depth image. The weight coefficients may vary; for example, for near objects the pixel value from the second depth image receives a higher weight.
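The variable-weight scheme can be sketched like this; the specific weights (0.8/0.2) and the near/far test are illustrative assumptions, since the text only requires that near pixels weight the second depth image more heavily:

```python
import numpy as np

def fuse_weighted(depth1, depth2, near_threshold=2.0):
    """Weighted average of corresponding pixels, with the second (near-range)
    depth image weighted more heavily for near pixels. The weight values and
    the near/far test are illustrative assumptions."""
    near = depth2 < near_threshold      # hypothetical 'near object' test
    w2 = np.where(near, 0.8, 0.2)       # second image dominates up close
    return w2 * depth2 + (1.0 - w2) * depth1
```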
Another fusion method is: create an image with higher resolution than the current acquisition camera sensor, and compute the pixel value of each pixel of this image one by one from the pixels of the first and second depth images, finally obtaining a depth image of higher resolution. For example, taking the first depth image as the reference image, the values at the 1/2, 1/4, and similar sub-pixel positions of the first depth image are computed with reference to the second depth image, thereby increasing the resolution of the depth image.
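A toy one-dimensional sketch of the sub-pixel idea, doubling resolution along a row; the blend that estimates the half-pixel positions from the two images is an illustrative assumption, as the utility model does not fix a formula:

```python
import numpy as np

def upsample_fused(depth1, depth2):
    """Create a row twice as long as the sensor row: integer positions copy
    the first (reference) depth image, and the 1/2 sub-pixel positions are
    estimated by averaging the reference neighbors with the co-registered
    second depth image (illustrative scheme)."""
    n = depth1.shape[0]
    out = np.empty(2 * n)
    out[0::2] = depth1                              # integer pixels
    mid = 0.5 * (depth1 + np.roll(depth1, -1))      # neighbor average
    mid[-1] = depth1[-1]                            # no right neighbor
    out[1::2] = 0.5 * (mid + depth2)                # blend with 2nd image
    return out
```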
In another version of this embodiment, the first optical projector 1 and the second optical projector 3 are located on opposite sides of the image acquisition unit 2. For a given object, the following phenomenon may occur: the depth of a region on the left side of the object cannot be obtained in the first depth image, while the depth of a region on the right side of the object cannot be obtained in the second depth image. This phenomenon is common in depth cameras composed of a single optical projector and a single image acquisition unit: a raised part of the object shadows one side from the projector's illumination, analogous to shadow regions in ordinary lighting. For this situation, the first depth image and the second depth image can be made to complement each other pixel by pixel, so that the third depth image after complementation no longer contains shadow regions with missing (hole) depth values.
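The pixel-value complementation can be sketched as follows; treating a depth value of 0 as the hole marker is an assumption (the utility model does not name a marker):

```python
import numpy as np

HOLE = 0.0  # assumed marker for a missing (shadowed) depth value

def fuse_complement(depth1, depth2):
    """Fill each hole of the first depth image with the value from the
    second depth image, so the shadow region of one projector is covered
    by the other. Pixels correspond one-to-one because both images come
    from the same sensor (no parallax)."""
    fused = depth1.copy()
    holes1 = depth1 == HOLE
    fused[holes1] = depth2[holes1]   # shadowed pixels taken from image 2
    return fused
```

Applying the same call with the arguments swapped fills the holes of the second image instead.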
In some alternative versions of embodiments 1, 2, 3, and 4, the image acquisition system may include more optical projectors as actually needed, for example three or four; the spatial arrangement of the optical projectors is not specifically limited, and their application follows the same essential concept as the embodiments above, so it is not repeated. It should be noted that as the number and arrangement of the optical projectors vary, the number of filter units in the corresponding image acquisition unit varies accordingly; the final purpose is to ensure that the light projected by all optical projectors can pass through the optical filter, that the image sensor receives all the light passing through the filter, converts it into an optical image, and sends it to the processor unit, and that the processor unit obtains the optical image and computes a depth image corresponding to each structured light image, optionally fusing the depth images further. The specific fusion method may differ slightly with the number of depth images, but all such variants fall within the scope protected by the utility model. Using the depth image acquisition system described in the utility model, multiple optical projectors may be arranged as required to emit structured light images at multiple wavelengths; the image acquisition unit captures the different wavelengths synchronously, and the processor unit obtains the optical image and processes it into parallax-free depth images, each of which can correspond to a different angle, eliminating the shadow regions of a single depth image, or to a different distance, enabling measurement over a larger depth range. Such concrete applications to particular problems should also be regarded as within the scope protected by the utility model.
The content above describes the utility model in further detail in combination with specific preferred embodiments, but the specific implementation of the utility model is not limited to these descriptions. For a person of ordinary skill in the field of the utility model, equivalent substitutions or obvious modifications made without departing from its concept, with identical performance or use, should all be regarded as falling within the scope of protection of the utility model.
Claims (7)
1. A depth image acquisition system, characterized by including:
an optical projection unit including at least two optical projectors, each of which emits a structured light image at its own wavelength;
an image acquisition unit including an optical filter and an image sensor, the optical filter including at least two filter units, each of which passes only the light emitted by one of the at least two optical projectors, the image sensor receiving the light passing through the optical filter, converting it into an optical image, and sending the optical image to a processor unit;
a processor unit for receiving the optical image, separating it into the structured light images corresponding to the at least two projectors, and computing depth images.
2. The depth image acquisition system of claim 1, characterized in that it further includes a memory unit for storing the depth image.
3. The depth image acquisition system of claim 1, characterized in that the at least two structured light images differ in at least one of wavelength, light intensity, and pattern density.
4. The depth image acquisition system of claim 1, characterized in that the at least two optical projectors and the image acquisition unit are arranged in the same plane, and the distances between the at least two optical projectors and the image acquisition unit differ.
5. The depth image acquisition system of claim 1, characterized in that the light source of each optical projector is a VCSEL array laser.
6. The depth image acquisition system of claim 1, characterized in that the image sensor is a CMOS or CCD sensor.
7. The depth image acquisition system of claim 1, characterized in that the processor unit includes an AP processor.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201720235697.XU CN206807664U (en) | 2017-03-09 | 2017-03-09 | Depth image obtains system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN206807664U true CN206807664U (en) | 2017-12-26 |
Family
ID=60729913
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201720235697.XU Active CN206807664U (en) | 2017-03-09 | 2017-03-09 | Depth image obtains system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN206807664U (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106954058A (en) * | 2017-03-09 | 2017-07-14 | 深圳奥比中光科技有限公司 | Depth image obtains system and method |
CN106954058B (en) * | 2017-03-09 | 2019-05-10 | 深圳奥比中光科技有限公司 | Depth image obtains system and method |
CN109277015A (en) * | 2018-08-30 | 2019-01-29 | 东莞市闻誉实业有限公司 | Raw material stirs lighting device |
WO2020093294A1 (en) * | 2018-11-08 | 2020-05-14 | 深圳市汇顶科技股份有限公司 | Vertical cavity surface light-emitting laser, structured light module, and light projection method and terminal |
US10971900B2 (en) | 2018-11-08 | 2021-04-06 | Shenzhen GOODIX Technology Co., Ltd. | Vertical-cavity surface-emitting laser, structured light module and method for light projection and terminal |
Legal Events
Date | Code | Title | Description
---|---|---|---
| GR01 | Patent grant | |
| CP01 | Change in the name or title of a patent holder | |

Address after: A808, Zhongdi building, industry university research base, China University of Geosciences, No.8, Yuexing Third Road, Nanshan District, Shenzhen, Guangdong 518000
Patentee after: Obi Zhongguang Technology Group Co., Ltd
Address before: A808, Zhongdi building, industry university research base, China University of Geosciences, No.8, Yuexing Third Road, Nanshan District, Shenzhen, Guangdong 518000
Patentee before: SHENZHEN ORBBEC Co.,Ltd.