CN205193274U - Spatial information capture device - Google Patents

Spatial information capture device

Info

Publication number
CN205193274U
CN205193274U CN201520934812.3U
Authority
CN
China
Prior art keywords
pattern
optically coded element
spatial information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201520934812.3U
Other languages
Chinese (zh)
Inventor
陈志隆
颜智敏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Gaozhun International Technology Co., Ltd
Original Assignee
Everready Precision Ind Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Everready Precision Ind Corp filed Critical Everready Precision Ind Corp
Priority to CN201520934812.3U priority Critical patent/CN205193274U/en
Priority to US14/996,785 priority patent/US9977305B2/en
Application granted granted Critical
Publication of CN205193274U publication Critical patent/CN205193274U/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Abstract

The utility model provides a spatial information capture device comprising a structured light generation module, a camera module and a processing control unit. The structured light generation module emits a structured light pattern onto a target object so as to form an estimation pattern on the target object. The camera module comprises a lens assembly, an optically coded element and a sensing unit; the optically coded element carries a reference pattern, and the reference pattern and the estimation pattern share at least some corresponding pattern features. Both the reference pattern and the estimation pattern are projected onto the sensing unit, and the processing control unit compares the pattern feature differences between the reference pattern and the estimation pattern to determine the spatial distance information of the target object. The utility model has a simple structure, which helps simplify the processing procedure of the processing control unit and reduce cost.

Description

Spatial information capture device
Technical field
The utility model relates to a spatial information capture device, and more particularly to a spatial information capture device that senses spatial information by emitting a pattern through a structured light generation module and capturing the pattern with a camera module.
Background art
In recent years, with the evolution of the electronics industry and the rapid development of industrial technology, various electronic devices have gradually been designed and developed to be light and easy to carry, so that users can apply them to mobile commerce, entertainment or leisure anytime and anywhere. For example, all kinds of image capture devices are widely used in various fields, such as smart phones, wearable electronic devices and other portable electronic devices, which have the advantages of small size and portability.
Moreover, as the quality of life improves, people place higher demands on the images captured by image capture devices. For example, people expect the captured image to be a 3D stereoscopic image containing accurate depth information; people also expect portable electronic devices to provide a distance-measuring function, which in turn enables gesture recognition. At present, depth information or distance can be measured by methods such as time-of-flight (Time of Flight, ToF) ranging or dual-camera ranging.
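For context, and not as part of the patent text: time-of-flight ranging derives distance from the round-trip travel time of emitted light,

$$d = \frac{c \cdot \Delta t}{2},$$

where $c$ is the speed of light and $\Delta t$ is the measured round-trip time; the factor of two accounts for the out-and-back path.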
However, although time-of-flight ranging offers good measurement accuracy, its software computation becomes very complicated when generalized to surface or multi-point scenarios, and more computing chips and integrated circuits (ICs) must be introduced, so its power consumption is high and its computation cost is also high. In addition, time-of-flight ranging is easily affected by ambient brightness, and the achievable measurement accuracy is low when ambient light interference is strong. As for dual-camera ranging, it also has a certain degree of complexity in software computation and is not particularly simple; although its power consumption and computation cost compare favorably with the time-of-flight method, its ranging performance on smooth surfaces is poor, i.e., it has the shortcoming that the measurement accuracy obtained on smooth surfaces is low.
In view of this, providing a miniaturized spatial information capture device that can reliably obtain the spatial distance information of a target object is a goal that this technical field urgently needs to achieve.
Utility model content
The technical problem to be solved by the utility model is to overcome the above shortcomings of the prior art by providing a spatial information capture device that can reliably obtain the spatial distance information of a target object, has a simple structure, helps simplify the processing procedure of the processing control unit, and reduces cost.
The technical solution adopted by the utility model to solve this technical problem is to provide a spatial information capture device comprising a structured light generation module, a camera module and a processing control unit. The structured light generation module emits a structured light pattern onto a target object, so that the target object presents an estimation pattern. The camera module comprises a lens assembly, an optically coded element and a sensing unit; the optically coded element carries a reference pattern, the reference pattern and the estimation pattern share at least some corresponding pattern features, and both the reference pattern and the estimation pattern are projected onto the sensing unit. The processing control unit is signal-connected to the sensing unit and compares the pattern feature differences between the reference pattern and the estimation pattern to determine the spatial distance information of the target object.
Preferably, the lens assembly has an optical axis and a plurality of lenses, and the optically coded element and the plurality of lenses are arranged in sequence along the optical axis, so that the estimation pattern is projected through the plurality of lenses located on the optical axis onto the sensing unit located on the optical axis, and the reference pattern located on the optical axis is projected onto the sensing unit located on the optical axis; wherein the optically coded element is attached to the lens assembly, or the optically coded element is separated from the lens assembly.
Preferably, the lens assembly has a plurality of lenses, the estimation pattern is projected through the plurality of lenses onto a first viewpoint position of the sensing unit, the reference pattern is projected onto a second viewpoint position of the sensing unit, and the first viewpoint position is different from the second viewpoint position (different aperture for viewing); wherein the optically coded element is attached to the lens assembly, or the optically coded element is separated from the lens assembly.
Preferably, the lens assembly has a first optical axis, a second optical axis, a prism and a plurality of lenses, and the prism refractively separates the first optical axis and the second optical axis, so that the estimation pattern is projected through the plurality of lenses located on the first optical axis onto the sensing unit located on the first optical axis, and the reference pattern located on the second optical axis is projected through the prism onto the sensing unit located on the first optical axis; wherein the optically coded element is attached to the lens assembly, or the optically coded element is separated from the lens assembly.
Preferably, the optically coded element is a self-luminous element, or the optically coded element is an optically coded film.
Preferably, the optically coded element is composed of a plurality of liquid crystal structures, and the plurality of liquid crystal structures are controlled by a programming unit to present the reference pattern.
Preferably, the optical wavelength of the optically coded element is different from the optical wavelength of the structured light emitted by the structured light generation module.
Preferably, the camera module further comprises a housing and an adjusting mechanism, a part of the adjusting mechanism being exposed outside the housing; wherein the adjusting mechanism is linked to the optically coded element, and the optically coded element is driven by the adjusting mechanism so that a user can adjust the relative position between the optically coded element and the lens assembly.
Preferably, the structured light generation module comprises a light emitting source, a collimating lens and a diffractive optical element, and the structured light pattern emitted by the structured light generation module corresponds to the estimation pattern presented on the target object; wherein the light emitting source comprises at least one of a laser diode (LD), a light emitting diode (LED), an organic light emitting diode (OLED) and a light emitting unit for outputting a light beam within a thermal-sensing wavelength range.
Preferably, the structured light pattern comprises at least one of a grid pattern, a radially divergent pattern, a multi-dot pattern and a symmetric/asymmetric pattern.
Preferably, the structured light generation module and the processing control unit are dynamically linked to each other, so that the processing control unit regulates the structured light generation module in response to dynamic changes of the estimation pattern.
Preferably, the light emitting source of the structured light generation module can be adjusted, so that the estimation pattern changes accordingly.
Preferably, the diffractive optical element of the structured light generation module can be adjusted, so that the estimation pattern changes accordingly.
In the spatial information capture device of the utility model, an optically coded element is provided in the camera module. The reference pattern projected by the optically coded element and the estimation pattern produced by the structured light generation module irradiating the target object share at least some corresponding pattern content, and the reference pattern and the estimation pattern can be projected onto the sensing unit simultaneously, so that the processing control unit can compare the differences in pattern content between them and determine the spatial distance information of the target object, without the content of the reference pattern having to be built into the processing control unit in advance. In this way, not only are setup time and labor cost saved, but the processing procedure of the processing control unit is also simplified, for example by simplifying the algorithm of the processing control unit, increasing the speed of the comparison computation and reducing the computation cost.
Brief description of the drawings
Fig. 1 is a conceptual schematic view of the spatial information capture device of the utility model estimating a target object.
Fig. 2 is a conceptual schematic view of the camera module of the first embodiment of the utility model.
Fig. 3 is a conceptual schematic view of the camera module of the second embodiment of the utility model.
Fig. 4 is a conceptual schematic view of the camera module of the third embodiment of the utility model.
Detailed description of the embodiments
Please refer to Fig. 1, which is a conceptual schematic view of the spatial information capture device of the utility model estimating a target object.
The spatial information capture device 1 of the utility model can capture the spatial information of a target object. The spatial information referred to here includes the relative height and depth of the target object surface, the distance between the target object and the spatial information capture device, and so on; such spatial information is very helpful for building 3D stereoscopic images. The detailed back-end computation methods for building 3D stereoscopic images are not the focus of the utility model and are not repeated here.
The spatial information capture device 1 of the utility model comprises a structured light generation module 11, a camera module 12 and a processing control unit 13. When a user wants to acquire the spatial information of a target object 9, the structured light generation module 11 of the spatial information capture device 1 can emit a structured light pattern (structured lighting pattern) onto the target object 9, so that an estimation pattern 113b is presented on the target object 9. The estimation pattern 113b is used for detecting the spatial information of the target object 9, as described in detail later.
The structured light generation module 11 is introduced first. The structured light generation module 11 comprises a light emitting source 111, a collimating lens 112 and a diffractive optical element 113. The light emitting source 111 can output a plurality of light beams, and the collimating lens 112 is arranged between the light emitting source 111 and the diffractive optical element 113; its function is to collimate the plurality of light beams so that they are incident on the diffractive optical element 113. The diffractive optical element 113 has a pattern formed thereon; when the plurality of light beams pass through the collimating lens 112 and then through the diffractive optical element 113, the structured light generation module 11 outputs the structured light pattern 113a.
Specifically, the light emitting source 111 can be one of a laser diode (LD), a light emitting diode (LED), an organic light emitting diode (OLED) and a light emitting unit for outputting a light beam within a thermal-sensing wavelength range. The structured light pattern 113a can be at least one of a grid pattern, a radially divergent pattern, a multi-dot pattern and a symmetric/asymmetric pattern, but this is merely an example and is not a limitation.
When the structured light L irradiates the target object 9, because of the relative irradiation angle between the structured light generation module 11 and the target object 9, or because of the surface relief of the target object 9, the estimation pattern 113b presented on the surface of the target object 9 is slightly distorted and differs from the original structured light pattern 113a. Nevertheless, the two still share at least some corresponding content, such as: offset or tilt angle, the positions of corresponding dots or lines, the sizes of corresponding dots, the thicknesses of corresponding lines, corresponding line lengths, corresponding line directions, corresponding line curvatures and so on. Taking the structured light pattern 113a and the estimation pattern 113b drawn in Fig. 1 of the utility model as an example, because of the irradiation angle, the originally parallel grid lines of the structured light pattern 113a are deformed in the estimation pattern 113b into non-parallel lines that are narrow on the left and wide on the right. However, this is only an example for ease of explanation, and is not a limitation.
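The patent gives no quantitative relation between this distortion and depth, but under the standard structured-light triangulation model (a common assumption for devices of this class, using a pinhole-camera approximation), the depth $Z$ of a surface point follows from the lateral shift $d$ of a pattern feature:

$$Z = \frac{f \cdot B}{d},$$

where $f$ is the focal length of the camera module and $B$ is the baseline between the structured light generation module and the camera module.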
Next camera model 12 is introduced.The utility model camera model 12 comprises lens combination 121, pumped FIR laser part (opticallycodedelement) 122 and a sensing unit 123.One of characteristic of the present utility model is that pumped FIR laser part 122 has a benchmark pattern 122a, the pattern of benchmark pattern 122a is preferably identical or is comparable to structured light pattern 113a in design, the object that pumped FIR laser part 122 is arranged is to provide benchmark pattern 122a and benchmark pattern 122a is projected towards the sensing unit 123 of camera model 12, make sensing unit 123 to sense benchmark pattern 122a (this is first pattern be projeced on sensing unit 123), as the benchmark of comparison.On the other hand, leading portion describes on the outside surface of described target object 9, the estimation pattern 113b presented also can be captured by sensing unit 123 perception of camera model 12, also therefore sensing unit 123 can sense estimation pattern 113b (this is second pattern be projeced on sensing unit 123).Thereafter, again through the processing and control element (PCE) be connected with sensing unit 123 signal 13 couples of benchmark pattern 122a (first pattern) and estimate pattern 113b (second pattern) and perform drawing feature comparison, so can calculate the spatial information of target object 9, be more than the main concept of the utility model execution.
It must first be noted that the reference pattern 122a does not have to be identical to the estimation pattern 113b; it only needs to share some common parts with it, i.e., the reference pattern 122a and the estimation pattern 113b have at least some corresponding pattern features, so that the comparison can be carried out using the common pattern parts as feature points. In addition, the structured light generation module 11 and the processing control unit 13 are dynamically linked to each other, so that the processing control unit 13 can, actively or passively, regulate the structured light generation module 11 in response to dynamic changes of the estimation pattern 113b. For example, the light emitting source 111 of the structured light generation module 11 can be adjusted so that the estimation pattern 113b changes accordingly; or the diffractive optical element 113 of the structured light generation module 11 can be adjusted so that the estimation pattern 113b changes accordingly. The detailed capture mechanism and the content of the camera module are described below.
The following three different camera modules are all embodiments of the utility model; each allows both the reference pattern of the optically coded element and the estimation pattern presented on the target object to be well captured by the sensing unit of the camera module, compared by the processing control unit, and used to calculate the spatial information of the target object.
Fig. 2 is a conceptual schematic view of the camera module of the first embodiment of the utility model. In the first embodiment, the camera module 22 comprises a lens assembly 221, an optically coded element 222 and a sensing unit 223, wherein the lens assembly 221, the optically coded element 222 and the sensing unit 223 are arranged in sequence along an optical axis X. The lens assembly 221 comprises a plurality of lenses 221a, and the estimation pattern 113b presented on the target object 9 can be projected and imaged through the plurality of lenses 221a onto the sensing unit 223 located on the optical axis X, whereupon the sensing unit 223 captures the image. On the other hand, the reference pattern 222a on the optically coded element 222 can likewise be projected and imaged toward the sensing unit 223 through a convex lens 225, whereupon the sensing unit 223 captures the image. Moreover, because the lens assembly 221, the optically coded element 222 and the sensing unit 223 in this embodiment are arranged in sequence along the optical axis X, the captured estimation pattern 113b and the captured reference pattern 222a are at least partly superimposed in the image; the spatial information of the target object 9 can then be obtained through computation by the processing control unit 23.
In this embodiment, in order for the reference pattern 222a to be clearly imaged on the sensing unit 223, the convex lens 225 is arranged between the optically coded element 222 and the sensing unit 223; preferably, the distance between the optically coded element 222 and the convex lens 225 is the same as the distance between the convex lens 225 and the sensing unit 223. As for the arrangement of the optically coded element 222, it can be attached to the lens assembly 221, or it can be separated from the lens assembly 221; there is no restriction here.
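For reference, this equal-distance arrangement can be read through the thin-lens equation (an inference from the stated geometry, not an explicit statement of the patent): with object distance $s_o$ from the optically coded element 222 to the convex lens 225 and image distance $s_i$ from the convex lens 225 to the sensing unit 223,

$$\frac{1}{f} = \frac{1}{s_o} + \frac{1}{s_i}, \qquad s_o = s_i \;\Rightarrow\; s_o = s_i = 2f, \qquad m = -\frac{s_i}{s_o} = -1,$$

so a sharp image forms exactly when each distance equals twice the focal length of the convex lens 225, and the reference pattern 222a is reproduced on the sensing unit 223 at unit magnification.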
Furthermore, the optically coded element 222 of the utility model has the following two implementation aspects. In the first implementation aspect, the optically coded element 222 is a self-luminous element; for example, it is composed of a plurality of liquid crystal structures, and those liquid crystal structures can be controlled by a programming unit (not shown) to manage the pattern they present. Moreover, to achieve the goal of obtaining spatial information, the programming unit controls the optically coded element 222 so that the reference pattern 222a it presents corresponds to the structured light pattern produced by the structured light generation module, and therefore the reference pattern 222a naturally corresponds to the estimation pattern 113b on the target object 9. In the second implementation aspect, the optically coded element 222 is an optically coded film; after ambient light enters the camera module 22 and passes through the optically coded film, the reference pattern 222a on the film can be projected and imaged on the sensing unit 223.
Moreover, to improve the distinguishability of the superimposed estimation pattern 113b and reference pattern 222a captured by the sensing unit 223, the optically coded element 222 and the structured light L can further be designed to have different optical wavelengths, so that the two projected patterns on the sensing unit 223 become more identifiable, which in turn improves computation accuracy.
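As one possibility for exploiting the different wavelengths (the patent does not say how the sensing unit separates them; the channel assignment here is an assumption for illustration), a color-capable sensing unit could recover the two superimposed patterns channel by channel:

```python
import numpy as np

def separate_patterns(raw: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Split a superimposed capture into its two wavelength components.

    Assumes an (H, W, 3) color capture in which the structured light lands
    mainly in channel 0 and the reference pattern mainly in channel 2.
    """
    estimation = raw[:, :, 0].astype(np.float32)  # structured-light wavelength
    reference = raw[:, :, 2].astype(np.float32)   # optically coded element wavelength
    # Normalize each channel so later block matching compares shapes, not brightness.
    for img in (estimation, reference):
        img -= img.min()
        peak = img.max()
        if peak > 0:
            img /= peak
    return reference, estimation
```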
On the other hand, to make it easy to adjust how the reference pattern 222a on the optically coded element 222 is projected onto the sensing unit 223, the camera module 22 further comprises a housing 224 and an adjusting mechanism 226, a part of the adjusting mechanism 226 being exposed outside the housing 224. The adjusting mechanism 226 is linked to the optically coded element 222, so when a user operates the adjusting mechanism 226, the optically coded element 222 moves up/down, left/right or back/forth together with the adjusting mechanism 226, thereby adjusting the relative position between the optically coded element 222 and the lens assembly 221/sensing unit 223.
Fig. 3 is a conceptual schematic view of the camera module of the second embodiment of the utility model. The component composition of the second embodiment is similar to that of the first embodiment; the main difference is the relative arrangement of the optically coded element 322 and the lens assembly 321. In the second embodiment, the camera module 32 comprises a lens assembly 321, an optically coded element 322 and a sensing unit 323. The lens assembly 321 comprises a plurality of lenses 321a; the estimation pattern 113b presented on the target object 9 can be projected and imaged through the plurality of lenses 321a located on a first optical axis X1 onto a first viewpoint position P1 of the sensing unit 323, whereupon the sensing unit 323 captures the image. On the other hand, the reference pattern 322a on the optically coded element 322 is projected and imaged through a convex lens 325 located on a second optical axis X2 onto a second viewpoint position P2 of the sensing unit 323, whereupon the sensing unit 323 captures the image.
Here, the first optical axis X1 is preferably arranged parallel or nearly parallel to the second optical axis X2, so that the first viewpoint position P1 is different from the second viewpoint position P2 (different aperture for viewing). By arranging the lens assembly 321 and the optically coded element 322 side by side, the optically coded element 322 does not interfere with the imaging of the estimation pattern 113b on the target object 9 onto the sensing unit 323. The captured estimation pattern 113b and the captured reference pattern 322a can further be virtually superimposed by the processing control unit 33, or the similarities and differences between the patterns can be calculated directly, so as to obtain the spatial information of the target object 9.
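Again only as a sketch building on the block-matching example above (the sensor layout and region coordinates are invented for illustration), the processing control unit in this embodiment could crop the two viewpoint regions from a single sensor frame before comparing them:

```python
import numpy as np

# Hypothetical sensor layout: estimation pattern imaged on the left half
# (around viewpoint P1), reference pattern on the right half (around P2).
def crop_viewpoints(frame: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    h, w = frame.shape
    estimation = frame[:, : w // 2]   # region around viewpoint P1
    reference = frame[:, w // 2 :]    # region around viewpoint P2
    return reference, estimation

frame = np.random.rand(480, 640).astype(np.float32)  # stand-in sensor frame
reference, estimation = crop_viewpoints(frame)
shifts = block_disparities(reference, estimation)    # reuses the earlier sketch
```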
As in the second embodiment, in order for the reference pattern 322a to be clearly imaged on the sensing unit 323, the convex lens 325 is likewise arranged between the optically coded element 322 and the sensing unit 323, and the distance between the optically coded element 322 and the convex lens 325 is preferably the same as the distance between the convex lens 325 and the sensing unit 323. To make it easy to adjust how the reference pattern 322a on the optically coded element 322 is projected onto the sensing unit 323, the camera module 32 is also provided with an adjusting mechanism 326 for adjusting the relative position between the optically coded element 322 and the sensing unit 323.
Fig. 4 is a conceptual schematic view of the camera module of the third embodiment of the utility model. The component composition of the third embodiment is similar to that of the first embodiment; the main differences are that the relative arrangement of the optically coded element and the lens assembly differs from the previous two embodiments, and that a prism is additionally provided. Specifically, in this embodiment the camera module 42 comprises a lens assembly 421, an optically coded element 422 and a sensing unit 423, wherein the lens assembly 421 has a first optical axis X1', a second optical axis X2', a plurality of lenses 421a and a prism 421b. The prism 421b refractively separates the first optical axis X1' and the second optical axis X2', so the estimation pattern 113b presented on the target object 9 can be well imaged, via the plurality of lenses 421a and the prism 421b located on the first optical axis X1', onto the sensing unit 423 located on the first optical axis X1', whereupon the sensing unit 423 captures the image. On the other hand, the reference pattern 422a on the optically coded element 422 arranged on the second optical axis X2' can be refracted through the prism 421b onto the sensing unit 423 located on the first optical axis X1'.
Here, the first optical axis X1' is preferably arranged perpendicular or nearly perpendicular to the second optical axis X2', and because the optically coded element 422 is set away from the first optical axis X1', it does not interfere with the imaging of the estimation pattern 113b on the target object 9 onto the sensing unit 423. Through the relative positions of the prism 421b, the lens assembly 421 and the optically coded element 422 designed in this embodiment, the captured estimation pattern 113b and the captured reference pattern 422a can be at least partly superimposed, and the spatial information of the target object 9 can then be calculated by the processing control unit 43.
As in the third embodiment, to make it easy to adjust how the reference pattern 422a on the optically coded element 422 is projected onto the sensing unit 423, the camera module 42 may be provided with an adjusting mechanism 426 for adjusting the relative position between the optically coded element 422 and the prism 421b.
In summary, the spatial information capture device of the utility model is specially provided with an optically coded element, and the reference pattern of the optically coded element and the estimation pattern on the target object can be projected onto the sensing unit simultaneously, so that the processing control unit can calculate the spatial distance information of the target object without the content of the reference pattern having to be built into the processing control unit in advance. This not only saves setup time and labor cost, but also simplifies the algorithm of the processing control unit and increases the speed of the comparison computation.
The above embodiments merely illustrate the principles and effects of the utility model and explain its technical solution; they are not intended to limit the scope of protection of the utility model. Any changes or equivalent arrangements that a person of ordinary skill in the art can make without departing from the technical principles and spirit of the utility model fall within the scope claimed by the utility model. Therefore, the scope of protection of the utility model shall be as set forth in the claims.

Claims (13)

1. A spatial information capture device, characterized by comprising:
a structured light generation module, which emits a structured light pattern onto a target object so that the target object presents an estimation pattern;
a camera module, which comprises a lens assembly, an optically coded element and a sensing unit, wherein the optically coded element carries a reference pattern, the reference pattern and the estimation pattern share at least some corresponding pattern features, and both the reference pattern and the estimation pattern are projected onto the sensing unit; and
a processing control unit, which is signal-connected to the sensing unit and compares the pattern feature differences between the reference pattern and the estimation pattern to determine the spatial distance information of the target object.
2. The spatial information capture device as claimed in claim 1, characterized in that the lens assembly has an optical axis and a plurality of lenses, and the optically coded element and the plurality of lenses are arranged in sequence along the optical axis, so that the estimation pattern is projected through the plurality of lenses located on the optical axis onto the sensing unit located on the optical axis, and the reference pattern located on the optical axis is projected onto the sensing unit located on the optical axis; wherein the optically coded element is attached to the lens assembly, or the optically coded element is separated from the lens assembly.
3. The spatial information capture device as claimed in claim 1, characterized in that the lens assembly has a plurality of lenses, the estimation pattern is projected through the plurality of lenses onto a first viewpoint position of the sensing unit, the reference pattern is projected onto a second viewpoint position of the sensing unit, and the first viewpoint position is different from the second viewpoint position; wherein the optically coded element is attached to the lens assembly, or the optically coded element is separated from the lens assembly.
4. The spatial information capture device as claimed in claim 1, characterized in that the lens assembly has a first optical axis, a second optical axis, a prism and a plurality of lenses, and the prism refractively separates the first optical axis and the second optical axis, so that the estimation pattern is projected through the plurality of lenses located on the first optical axis onto the sensing unit located on the first optical axis, and the reference pattern located on the second optical axis is projected through the prism onto the sensing unit located on the first optical axis; wherein the optically coded element is attached to the lens assembly, or the optically coded element is separated from the lens assembly.
5. The spatial information capture device as claimed in claim 1, characterized in that the optically coded element is a self-luminous element, or the optically coded element is an optically coded film.
6. The spatial information capture device as claimed in claim 1, characterized in that the optically coded element is composed of a plurality of liquid crystal structures, and the plurality of liquid crystal structures are controlled by a programming unit to present the reference pattern.
7. The spatial information capture device as claimed in claim 1, characterized in that the optical wavelength of the optically coded element is different from the optical wavelength of the structured light emitted by the structured light generation module.
8. The spatial information capture device as claimed in claim 1, characterized in that the camera module further comprises a housing and an adjusting mechanism, a part of the adjusting mechanism being exposed outside the housing; wherein the adjusting mechanism is linked to the optically coded element, and the optically coded element is driven by the adjusting mechanism so that a user can adjust the relative position between the optically coded element and the lens assembly.
9. The spatial information capture device as claimed in claim 1, characterized in that the structured light generation module comprises a light emitting source, a collimating lens and a diffractive optical element, and the structured light pattern emitted by the structured light generation module corresponds to the estimation pattern presented on the target object; wherein the light emitting source comprises at least one of a laser diode, a light emitting diode, an organic light emitting diode and a light emitting unit for outputting a light beam within a thermal-sensing wavelength range.
10. The spatial information capture device as claimed in claim 9, characterized in that the structured light pattern comprises at least one of a grid pattern, a radially divergent pattern, a multi-dot pattern and a symmetric/asymmetric pattern.
11. The spatial information capture device as claimed in claim 9, characterized in that the structured light generation module and the processing control unit are dynamically linked to each other, so that the processing control unit regulates the structured light generation module in response to dynamic changes of the estimation pattern.
12. The spatial information capture device as claimed in claim 11, characterized in that the light emitting source of the structured light generation module can be adjusted so that the estimation pattern changes accordingly.
13. The spatial information capture device as claimed in claim 11, characterized in that the diffractive optical element of the structured light generation module can be adjusted so that the estimation pattern changes accordingly.
CN201520934812.3U 2015-11-20 2015-11-20 Spatial information capture device Active CN205193274U (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201520934812.3U CN205193274U (en) 2015-11-20 2015-11-20 Spatial information capture device
US14/996,785 US9977305B2 (en) 2015-11-20 2016-01-15 Spatial information capturing device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201520934812.3U CN205193274U (en) 2015-11-20 2015-11-20 Spatial information capture device

Publications (1)

Publication Number Publication Date
CN205193274U true CN205193274U (en) 2016-04-27

Family

ID=55786253

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201520934812.3U Active CN205193274U (en) 2015-11-20 2015-11-20 Spatial information capture device

Country Status (1)

Country Link
CN (1) CN205193274U (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108242072A (en) * 2016-12-23 2018-07-03 捷西迪光学(开曼)股份有限公司 Establish the method for space map and the patch graphical set applied to this method


Similar Documents

Publication Publication Date Title
CN106055172B (en) Optical navigation chip, optical navigation module and optical encoder
JP6854366B2 (en) Detector that optically detects at least one object
US11889046B2 (en) Compact, low cost VCSEL projector for high performance stereodepth camera
CN106716059B (en) Detector for optically determining the position of at least one object
US11461908B2 (en) Image processing method and apparatus, and image processing device using infrared binocular cameras to obtain three-dimensional data
CN106296716A (en) The power regulating method of light source, depth measurement method and device
KR102113752B1 (en) Staggered array of light-emitting elements to sweep the angular range
CN206892529U (en) Structured light generators
US20130002859A1 (en) Information acquiring device and object detecting device
CN207460318U (en) Convenient for fixed optics module
US20130010292A1 (en) Information acquiring device, projection device and object detecting device
CN205283683U (en) Optical device
CN208297850U (en) A kind of light source module group, image acquiring device, identity recognition device and electronic equipment
JPWO2012147495A1 (en) Information acquisition device and object detection device
CN108169757A (en) Center pixel high-precision identification light measurement system and method
CN205193274U (en) Spatial information capture device
CN208569285U (en) Projective module group, electrooptical device and electronic equipment
CN106289092A (en) Optical devices and light-emitting device thereof
CN100359286C (en) Method for improving laser measuring accuracy in image processing
CN106911877A (en) Optical devices
CN217135613U (en) Structured light system and electronic equipment with same
US20160366395A1 (en) Led surface emitting structured light
CN101114027A (en) Compound eye lens detector
CN104406546B (en) The method that Reference Transforming is realized using removable graticle
CN210570528U (en) Depth detection system, bracket thereof and electronic device

Legal Events

Date Code Title Description
C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20200107

Address after: PO box 31119 Furong Bay, 802 West Bay Road, ky1-1205 Grand Cayman, Cayman Islands

Patentee after: Gaozhun International Technology Co., Ltd

Address before: 1 Donger street, Nanzi processing district, Nanzi District, Kaohsiung City

Patentee before: Everready Precision Ind. Corp.

TR01 Transfer of patent right