CN107884066A - Optical sensor and its 3D imaging devices based on flood lighting function - Google Patents
Optical sensor and its 3D imaging devices based on flood lighting function
- Publication number
- CN107884066A (application CN201710908882.5A)
- Authority
- CN
- China
- Prior art keywords
- light beam
- optical
- light
- unit
- imaging devices
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J1/00—Photometry, e.g. photographic exposure meter
- G01J1/42—Photometry, e.g. photographic exposure meter using electric radiation detectors
- G01J1/4204—Photometry, e.g. photographic exposure meter using electric radiation detectors with determination of ambient light
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/02—Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
- G01B11/026—Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by measuring distance between sensor and object
Abstract
The invention discloses a 3D imaging device, including: a projection module for projecting a patterned beam; an optical sensor including a first light emitting unit for emitting a first beam, a second light emitting unit for emitting a second beam, and a light detection unit for detecting the first beam in order to judge the proximity of an object; and a first collection module, which collects a first target image illuminated by the patterned beam when the projection module projects the patterned beam, and collects a second target image illuminated by the second beam when the second light emitting unit emits the second beam. The first beam is used for object proximity detection; the second beam is collected by the collection module of the 3D sensor and provides flood illumination for it, so that multiple functions such as proximity detection and flood imaging are fused in a single device.
Description
Technical field
The present invention relates to the field of optics and electronics, and in particular to an optical sensor with a flood illumination function and a 3D imaging device based on it.
Background technology
Optical sensors are widely used in intelligent terminals such as mobile phones, tablets and computers. In a mobile terminal, an ambient light sensor detects the intensity of ambient light so that the screen brightness can be adjusted automatically, while a proximity sensor detects whether an object is close to the device. In a mobile phone in particular, the phone must be brought close to the face during a call; the proximity sensor detects this and turns off the screen, preventing the face from accidentally triggering the touch screen.
3D sensors are also being adopted in intelligent terminals, particularly 3D sensors based on structured light. With them, three-dimensional measurement of a face can be performed, enabling functions such as 3D face modeling and 3D face recognition; 3D face recognition offers a higher-precision form of biometric identification than fingerprint recognition.
As the functional demands on smart devices grow, so do the requirements on optical sensors, and a traditional single-function optical sensor often cannot meet them. A proximity sensor is usually only able to detect whether an object is close at short range, with a single detection distance and range. A 3D sensor usually consists of a projection module and a collection module, occupies considerable device space, yet only provides 3D imaging; moreover, prior-art 3D sensors cannot perform flood imaging of a target, so high-precision face recognition is difficult at night. Growing demand thus forces smart devices to mechanically add further sensors, and the result tends to be a simple superposition of functions rather than an integrated fusion that yields additional capabilities.
Summary of the invention
To solve the above problems, the present invention proposes a highly integrated 3D imaging device that combines a 3D sensor with an optical sensor. Besides performing proximity detection, the optical sensor also provides flood illumination for the 3D sensor, so that multiple functions are fused in a single device.
The 3D imaging device provided by the invention includes: a projection module for projecting a patterned beam; an optical sensor including a first light emitting unit for emitting a first beam, a second light emitting unit for emitting a second beam, and a light detection unit for detecting the first beam in order to judge the proximity of an object; and a first collection module, which collects a first target image illuminated by the patterned beam when the projection module projects the patterned beam, and collects a second target image illuminated by the second beam when the second light emitting unit emits the second beam.
In some embodiments, the first target image is a speckle image and the second target image is a flood image.
In some embodiments, the wavelength of the patterned beam is the same as that of the second beam, so that the first and second target images can both be collected by the first collection module.
In some embodiments, the wavelength of the patterned beam differs from that of the second beam, and the first collection module is configured with filters for both wavelengths to collect the first and second target images.
In some embodiments, the wavelength of the first beam differs from that of the second beam.
In some embodiments, the light detection unit is also used to detect the intensity of ambient light.
In some embodiments, the first light emitting unit is turned off while the light detection unit detects ambient light. Further, when the first light emitting unit is turned on, the light detection unit detects the first beam together with ambient light, and extracts the first beam using a time-domain difference method.
In some embodiments, the power of the first light emitting unit is lower than that of the second light emitting unit.
In some embodiments, the emission angle of the first light emitting unit is smaller than that of the second light emitting unit.
In some embodiments, the first and second light emitting units form one integrated light emitting unit, which includes a light source array composed of multiple sub-light-sources under grouped control.
In some embodiments, the number of sub-light-sources belonging to the first light emitting unit is smaller than the number belonging to the second light emitting unit.
In some embodiments, the first and second light emitting units are VCSELs or VCSEL arrays.
In some embodiments, the 3D imaging device further includes an RGB camera or an earpiece receiver.
Beneficial effects of the present invention: by providing two light emitting units in the optical sensor, the first light emitting unit emits a first beam used for object proximity detection, while the second light emitting unit emits a second beam that is collected by the collection module of the 3D sensor, providing flood illumination for it. The 3D imaging device thereby fuses multiple functions such as proximity detection and flood imaging.
Brief description of the drawings
Fig. 1 is a schematic structural diagram of an optical sensor according to an embodiment of the invention.
Fig. 2 is a device layout diagram of a prior-art optical sensor.
Fig. 3 is a device layout diagram of an optical sensor according to an embodiment of the invention.
Fig. 4a is a schematic diagram of beam emission according to an embodiment of the invention.
Fig. 4b is a schematic diagram of beam emission according to an embodiment of the invention.
Fig. 5 is a device layout diagram of an optical sensor according to an embodiment of the invention.
Fig. 6 is a schematic diagram of a 3D imaging device according to an embodiment of the invention.
Embodiment
To make the technical problems addressed by the embodiments of the present invention, their technical solutions and their beneficial effects clearer, the invention is described in further detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here only explain the present invention and are not intended to limit it.
It should be noted that when an element is described as being "fixed to" or "arranged on" another element, it may be directly on the other element or indirectly connected to it. When an element is described as being "connected to" another element, the connection may likewise be direct or indirect. A connection may serve for mechanical fixing or for electrical communication.
It is to be understood that orientation or position terms such as "length", "width", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner" and "outer" are based on the orientations or positions shown in the drawings. They are used only to simplify the description of the embodiments, do not indicate or imply that the devices or elements referred to must have a specific orientation or be constructed and operated in a specific orientation, and are therefore not to be construed as limiting the invention.
In addition, the terms "first" and "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or the number of the indicated technical features. A feature qualified by "first" or "second" may thus explicitly or implicitly include one or more such features. In the description of the embodiments, "multiple" means two or more, unless specifically defined otherwise.
Fig. 1 is a schematic diagram of an optical sensor according to an embodiment of the invention. The optical sensor 10 includes a substrate 101, a housing 102, a light emitting unit 104, a light detection unit 105 and a baffle 106. The substrate 101 is a PCB that provides support and electrical connection for the light emitting unit 104 and the light detection unit 105; it can also be of any other type, such as a flexible circuit board (FPC) or a rigid-flex board, or be formed from other materials such as metals or ceramics. In one embodiment the substrate 101 is a semiconductor material, and the light emitting unit 104 and the light detection unit 105 can be fabricated directly on it. The light emitting unit 104 emits a beam 109 and includes a light emitting device such as an LED or a laser diode; in one embodiment the beam 109 is an invisible infrared beam. When the beam 109 irradiates a target 20, a reflected beam 110 returns to the light detection unit 105, which judges the proximity of the target, i.e. its distance, from the detected beam. The light detection unit 105 includes a light receiving element such as a photodiode, a phototransistor or an image sensor.
Proximity judgement generally takes one of two forms. In the first, the light detection unit 105 estimates the target distance from the intensity of the received beam. This approach is generally used to decide whether an object is close to the device, without measuring its distance precisely, as with the proximity sensor in a mobile phone: when the face approaches during a call, the light intensity received by the light detection unit 105 increases, and once it exceeds a certain threshold the face is considered close enough to the phone, whereupon the touch function is disabled and the screen is turned off. In the second, the time between the beam leaving the light emitting unit 104 and its reception by the light detection unit 105 is measured, and the target distance is computed precisely by the time-of-flight (TOF) method. The light detection unit 105 may be a single photodiode or multiple photodiodes; in one embodiment it is a multi-pixel image sensor. It should be noted that with either method, the proximity detection is completed by an additional processor: a processor in the phone, for example, performs the proximity judgement by exchanging data with the light emitting unit 104 and the light detection unit 105, or the sensor may have a dedicated processor of its own.
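The two proximity-judgement approaches described above can be sketched as follows. This is an illustrative sketch, not code from the patent; the function names and the intensity threshold are assumptions.

```python
# Sketch of the two proximity-judgement approaches: intensity
# thresholding (coarse near/far decision) and time-of-flight
# (precise distance). Threshold value is a made-up calibration.

C = 299_792_458.0  # speed of light in vacuum, m/s

def is_near(received_intensity, threshold=0.8):
    """Intensity approach: a closer target reflects more light, so
    compare the received intensity against a calibrated threshold."""
    return received_intensity > threshold

def tof_distance(round_trip_time_s):
    """TOF approach: the beam travels to the target and back, so the
    distance is half the round-trip path length."""
    return C * round_trip_time_s / 2.0
```

For example, a round trip of 2 ns corresponds to a target roughly 0.3 m away.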
To prevent the beam emitted by the light emitting unit 104 from entering the light detection unit 105 directly, without passing via the target 20, and thereby causing errors, a baffle 106 is placed between them. Windows 107 and 108 are provided in the housing 102 opposite the light emitting unit and the light detection unit respectively; the housing and baffle can be made of materials such as plastic or metal. When the sensor is integrated into a terminal device such as a mobile phone, it is often placed below the screen 103, which is generally a transparent material so as not to reflect the beams 109 and 110 excessively.
Fig. 2 is a device layout diagram of a prior-art optical sensor. A single light emitting unit 104 and a light detection unit 105 are arranged on the substrate 101. During proximity detection, only one distance range can be detected and judged: a common proximity sensor, for example, only detects objects within 10 cm, yet some applications also require detection at greater distances or over a wider range.
Fig. 3 is a device layout diagram of an optical sensor according to an embodiment of the invention. The difference from Fig. 2 is that the light emitting unit 104 consists of a first light emitting unit 1041 and a second light emitting unit 1042. The first light emitting unit 1041 has a lower emission power to support short-range proximity detection, for example 0-10 cm, while the second light emitting unit 1042 has a higher emission power to support longer-range proximity detection, for example 0-40 cm. The terminal device can switch on the corresponding light emitting unit according to the needs of the current application. In some embodiments the light emitting units include a vertical-cavity surface-emitting laser (VCSEL) or a VCSEL array; the high stability and small volume of VCSELs allow the sensor to be further miniaturized.
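The application-dependent choice between the two light emitting units described above might look like the following. The driver interface is invented for illustration; only the 0-10 cm and 0-40 cm ranges come from the description.

```python
# Hypothetical selection logic: a terminal device enables the
# lowest-power light emitting unit whose range covers what the
# current application needs. The dict keys and API are assumptions.

EMITTERS = {
    "first":  {"max_range_cm": 10, "power": "low"},   # short-range proximity
    "second": {"max_range_cm": 40, "power": "high"},  # longer range / flood
}

def select_emitter(required_range_cm):
    """Return the name of the weakest unit that covers the range."""
    candidates = [name for name, spec in EMITTERS.items()
                  if spec["max_range_cm"] >= required_range_cm]
    if not candidates:
        raise ValueError("required range exceeds all emitters")
    return min(candidates, key=lambda n: EMITTERS[n]["max_range_cm"])
```

A call-screen check within 5 cm would thus use the first unit, while a 30 cm gesture check would use the second.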
In some embodiments, a corresponding optical system is provided for each light emitting unit to shape the emitted beam, as shown in Figs. 4a and 4b. In Fig. 4a, the light emitting unit 1041 for short-range proximity detection is a single light emitting device whose beam is converged by a lens 401 and emitted at an angle θ1; the lens 401 can be placed at the housing window. The converging effect of the lens 401 makes the emitted beam more concentrated and prevents it from entering the light detection unit without striking a target. In Fig. 4b, the light emitting unit 1042 for longer-range proximity detection is a light source array whose beam passes through a beam-expanding element 402 and is emitted at a larger angle θ2, enabling detection over a wider range of objects; the beam-expanding element can be an optical component such as a diffractive optical element or a diffuser.
The light emitting units 1041 and 1042 may also form one integrated light emitting unit, for example an array light source composed of multiple VCSEL sub-light-sources, with grouped control of the array enabling 1041 and 1042 to emit independently. As shown in Fig. 5, the light emitting unit 104 is an array light source in which a minority of the sub-light-sources, for example a single sub-light-source 1041, serves short-range proximity detection, while the remaining sub-light-sources serve longer-range proximity detection. The grouped control can take any form: for example, a few sub-light-sources are switched on independently for short-range detection and the other sub-light-sources are switched on for longer-range detection, or all light sources are switched on simultaneously for longer-range detection.
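The grouped control just described can be modelled as below. This is an illustrative sketch under assumptions: the nine-source array size and the assignment of sub-source 0 to the near group are invented, and only the grouping behaviour follows the description.

```python
# Model of an integrated VCSEL array under grouped control: a
# minority group of sub-sources for short-range detection, the
# rest for longer-range detection, optionally all on at once.

class GroupedVcselArray:
    def __init__(self, n_sources=9, near_group=(0,)):
        self.near = set(near_group)                 # minority group
        self.far = set(range(n_sources)) - self.near
        self.on = set()                             # currently lit sources

    def enable_near(self):
        """Short-range proximity detection: only the near group."""
        self.on = set(self.near)

    def enable_far(self, include_near=False):
        """Longer-range detection: the far group alone, or every
        sub-source simultaneously, as the description allows."""
        self.on = set(self.far) | (self.near if include_near else set())
```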
The light detection unit 105 can be a single photodiode or multiple photodiodes. In one embodiment it is a single photodiode that can switch between short-range and longer-range proximity detection. In another embodiment it is a multi-pixel image sensor (not shown); the light emitting units 1041 and 1042 may then emit beams of different wavelengths, and the light detection unit 105 can be configured with filters of the corresponding wavelengths to detect both beams simultaneously. In some embodiments, different sub-units of the light detection unit 105 may also perform proximity detection in different ways: for example, a single detector in the light detection unit 105 performs intensity detection on the beam from light emitting unit 1041 to judge the proximity of an object, while multiple detectors perform TOF detection on the beam from light emitting unit 1042 to calculate the object's distance. Likewise, whether the light detection unit uses a single photodiode or multiple photodiodes, ambient light detection can be realized by a time-domain filtering method such as the time-domain difference method.
Because of ambient illumination, the beam detected by the light detection unit 105 actually contains both ambient light and the reflected beam produced when the beam emitted by the first light emitting unit 1041 and/or the second light emitting unit 1042 is reflected. Where ambient light need not be considered, it can be ignored, and the beam detected by the light detection unit 105 is taken by default to be the reflected beam from the first light emitting unit 1041 and/or the second light emitting unit 1042.
Where ambient light must be considered, the sensor 10 can also perform ambient light intensity detection: the light detection unit 105 detects the intensity of ambient light, a processor completes the detection, and further processing is carried out according to the result, for example adjusting the screen brightness of the terminal device.
When the first light emitting unit 1041 and the second light emitting unit 1042 are both off, the light detection unit 105 can perform ambient light intensity detection.
During proximity detection, ambient illumination is usually present at the same time, so the light detection unit 105 simultaneously detects the ambient light intensity and the intensity of the reflected beam produced by the first light emitting unit 1041 and/or the second light emitting unit 1042. The reflected beam intensity can be obtained by a time-domain filtering method to realize proximity detection. In one embodiment, for example, the time-domain difference method is used: under the ambient light detection mode (the first light emitting unit 1041 and the second light emitting unit 1042 both off) the ambient light intensity is detected, and under the proximity detection mode at the next instant (the first light emitting unit 1041 and/or the second light emitting unit 1042 on) the combined intensity of ambient light and the reflected beam is detected; subtracting the two readings in the time domain yields the intensity of the reflected beam.
When the light emitting unit 104 contains a second light emitting unit of higher power and larger emission angle, that light source can also provide a flood illumination function. Fig. 6 is a schematic diagram of a 3D imaging device according to an embodiment of the invention. The 3D imaging device includes a bracket 601 and, mounted in it, a first collection module 602, an optical sensor 605, a projection module 607 and a circuit board 608 for controlling the modules. The 3D imaging device can be embedded in a terminal device to realize functions such as 3D imaging, ambient light detection and proximity detection. In some embodiments a second collection module 603, such as an RGB camera, can be added to provide a photographing function, and an earpiece receiver 604 can be added to support calls.
The optical sensor 605 contains a light emitting unit and a light detection unit 6053, the light emitting unit comprising a first light emitting unit 6051 and a second light emitting unit 6052; the second light emitting unit has higher power and can emit a beam with a larger emission angle. The first light emitting unit emits a first beam used for proximity detection, and the second light emitting unit emits a second beam used for the flood illumination of the 3D sensor.
For the 3D imaging function, the projection module 607 projects a patterned beam, such as an infrared speckle-pattern beam, and the first collection module 602, a corresponding infrared camera module, collects the speckle image of the object irradiated by the infrared speckle beam; a processing circuit then computes the 3D information of the object from the speckle image to realize 3D imaging. In some applications, however, such as face recognition at night, a flood image under flood illumination is usually required, and the second light emitting unit 6052 in the optical sensor 605 provides the flood illumination. The wavelength of the beam emitted by the second light emitting unit 6052 can be the same as that emitted by the projection module 607, so that the speckle image and the flood image can both be collected by the same first collection module 602. Alternatively the two wavelengths can differ, in which case the first collection module 602 is fitted with filters for both wavelengths, so that some pixels of its image sensor image the beam at the wavelength of the second light emitting unit while the other pixels image the beam at the wavelength of the projection module 607.
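When a single shared collection module serves both purposes, one plausible arrangement is to alternate frames between the two light sources. This sequencing is an assumption for illustration, not a scheme stated in the patent, and the source names are invented.

```python
# Hypothetical capture schedule for one shared collection module:
# even frames under the projection module's speckle pattern (for
# depth), odd frames under the second light emitting unit's flood
# illumination (for a 2D image usable in face recognition at night).

def frame_schedule(n_frames):
    """Yield (frame_index, active_source, image_kind) tuples,
    alternating structured-light and flood frames."""
    for i in range(n_frames):
        if i % 2 == 0:
            yield (i, "projection_module", "speckle")  # depth frame
        else:
            yield (i, "second_emitter", "flood")       # flood frame
```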
In one embodiment the first and second light emitting units are set to different wavelengths. The light detection unit 6053 in the optical sensor 605 detects the first beam at the wavelength of the first light emitting unit, while the first collection module 602 detects the second beam at the wavelength of the second light emitting unit; because the wavelengths differ, the two do not interfere with each other.
The above is a further detailed description of the present invention in connection with specific/preferred embodiments, and the specific implementation of the invention is not to be considered limited to these descriptions. For those of ordinary skill in the technical field of the invention, several substitutions or modifications can be made to the described embodiments without departing from the inventive concept, and all such substitutions or modifications are to be regarded as falling within the protection scope of the invention.
Claims (10)
- 1. A 3D imaging device, characterized by including: a projection module for projecting a patterned beam; an optical sensor including: a first light emitting unit for emitting a first beam, a second light emitting unit for emitting a second beam, and a light detection unit for detecting the first beam in order to judge the proximity of an object; and a first collection module, which collects a first target image illuminated by the patterned beam when the projection module projects the patterned beam, and collects a second target image illuminated by the second beam when the second light emitting unit emits the second beam.
- 2. The 3D imaging device according to claim 1, characterized in that the wavelength of the patterned beam is the same as the wavelength of the second beam.
- 3. The 3D imaging device according to claim 1, characterized in that the wavelength of the patterned beam is different from the wavelength of the second beam, and the first collection module is configured with filters of the two wavelengths to collect the first target image and the second target image.
- 4. The 3D imaging device according to claim 1, characterized in that the wavelength of the first beam is different from the wavelength of the second beam.
- 5. The 3D imaging device according to claim 1, characterized in that the light detection unit is also used to detect the intensity of ambient light.
- 6. The 3D imaging device according to claim 5, characterized in that the first light emitting unit is turned off while the light detection unit detects ambient light.
- 7. The 3D imaging device according to claim 6, characterized in that the light detection unit detects the first beam together with ambient light when the first light emitting unit is turned on, and extracts the first beam using a time-domain difference method.
- 8. The 3D imaging device according to claim 1, characterized in that the power of the first light emitting unit is lower than that of the second light emitting unit, and the emission angle of the first light emitting unit is smaller than that of the second light emitting unit.
- 9. The 3D imaging device according to claim 1, characterized in that the first light emitting unit and the second light emitting unit form one integrated light emitting unit, the integrated light emitting unit including a light source array composed of multiple sub-light-sources under grouped control.
- 10. The 3D imaging device according to claim 1, characterized in that the second target image is a flood image; the 3D imaging device further includes a second collection module, the second collection module being an RGB camera.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710908882.5A CN107884066A (en) | 2017-09-29 | 2017-09-29 | Optical sensor and its 3D imaging devices based on flood lighting function |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107884066A true CN107884066A (en) | 2018-04-06 |
Family
ID=61781031
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710908882.5A Pending CN107884066A (en) | 2017-09-29 | 2017-09-29 | Optical sensor and its 3D imaging devices based on flood lighting function |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107884066A (en) |
- 2017-09-29: Application filed in China (CN201710908882.5A); published as CN107884066A (status: Pending)
Patent Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102901992A (en) * | 2011-07-26 | 2013-01-30 | 安华高科技Ecbuip(新加坡)私人有限公司 | Multi-directional proximity sensor |
CN104054008A (en) * | 2012-01-13 | 2014-09-17 | 松下电器产业株式会社 | Proximity sensor |
CN103677255A (en) * | 2012-09-03 | 2014-03-26 | 三星电子株式会社 | Method and apparatus for extracting three-dimensional distance information, terminal, and gesture operation method |
CN105229411A (en) * | 2013-04-15 | 2016-01-06 | 微软技术许可有限责任公司 | Sane three-dimensional depth system |
CN105094309A (en) * | 2014-05-09 | 2015-11-25 | 义明科技股份有限公司 | Optical sensing module and mobile device |
WO2016010481A1 (en) * | 2014-07-14 | 2016-01-21 | Heptagon Micro Optics Pte. Ltd. | Optoelectronic modules operable to distinguish between signals indicative of reflections from an object of interest and signals indicative of a spurious reflection |
CN105786104A (en) * | 2015-01-13 | 2016-07-20 | 摩托罗拉移动有限责任公司 | Portable Electronic Device with Dual, Diagonal Proximity Sensors and Mode Switching Functionality |
CN106550228A (en) * | 2015-09-16 | 2017-03-29 | 上海图檬信息科技有限公司 | Obtain the equipment of the depth map of three-dimensional scenic |
CN106921820A (en) * | 2015-12-24 | 2017-07-04 | 三星电机株式会社 | Image sensor and camera module |
CN106845449A (en) * | 2017-02-22 | 2017-06-13 | 浙江维尔科技有限公司 | A kind of image processing apparatus, method and face identification system |
CN107105217A (en) * | 2017-04-17 | 2017-08-29 | 深圳奥比中光科技有限公司 | Multi-mode depth calculation processor and 3D rendering equipment |
Cited By (37)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019213864A1 (en) * | 2018-05-09 | 2019-11-14 | 深圳阜时科技有限公司 | Three-dimensional target mapping apparatus, personal identification apparatus and electronic device |
TWI661232B (en) * | 2018-05-10 | 2019-06-01 | 視銳光科技股份有限公司 | Integrated structure of flood illuminator and dot projector |
CN108701233A (en) * | 2018-05-16 | 2018-10-23 | 深圳阜时科技有限公司 | Light source module, image acquisition device, identity recognition device and electronic device |
WO2019218274A1 (en) * | 2018-05-16 | 2019-11-21 | 深圳阜时科技有限公司 | Light source module, image acquisition apparatus, identity recognition apparatus, and electronic device |
CN108919597B (en) * | 2018-07-30 | 2024-02-13 | 深圳阜时科技有限公司 | Optical projection module |
CN108919597A (en) * | 2018-07-30 | 2018-11-30 | 深圳阜时科技有限公司 | An optical projection module |
CN109186494A (en) * | 2018-07-30 | 2019-01-11 | 深圳阜时科技有限公司 | A sensing method |
CN108924315A (en) * | 2018-08-08 | 2018-11-30 | 盎锐(上海)信息科技有限公司 | 3D photographic device and image pickup method for mobile terminal |
CN109068117A (en) * | 2018-09-11 | 2018-12-21 | 深圳阜时科技有限公司 | Light source module, 3D imaging system, identity recognition device and electronic device |
WO2020057207A1 (en) * | 2018-09-17 | 2020-03-26 | 深圳奥比中光科技有限公司 | Electronic device |
WO2020057208A1 (en) * | 2018-09-17 | 2020-03-26 | 深圳奥比中光科技有限公司 | Electronic device |
US20200409163A1 (en) * | 2018-09-17 | 2020-12-31 | Shenzhen Orbbec Co., Ltd. | Compensating display screen, under-screen optical system and electronic device |
US11422262B2 (en) | 2019-01-15 | 2022-08-23 | Shenzhen Guangjian Technology Co., Ltd. | Switchable diffuser projection systems and methods |
WO2020151493A1 (en) * | 2019-01-25 | 2020-07-30 | 深圳市光鉴科技有限公司 | Light projection system |
EP3951423A4 (en) * | 2019-03-25 | 2022-12-21 | Fujifilm Business Innovation Corp. | Light-emission device, optical device, and information processing device |
WO2020194774A1 (en) * | 2019-03-25 | 2020-10-01 | 富士ゼロックス株式会社 | Light-emission device, optical device, and information processing device |
JP7334439B2 (en) | 2019-03-25 | 2023-08-29 | 富士フイルムビジネスイノベーション株式会社 | vertical cavity surface emitting laser element array chip, light emitting device, optical device and information processing device |
WO2020194773A1 (en) * | 2019-03-25 | 2020-10-01 | 富士ゼロックス株式会社 | Light-emitting element array chip, light-emitting device, optical device, and information processing device |
JP2020161554A (en) * | 2019-03-25 | 2020-10-01 | 富士ゼロックス株式会社 | Light-emitting element array chip, light-emitting device, optical device, and information processing device |
US20210313776A1 (en) * | 2019-03-25 | 2021-10-07 | Fujifilm Business Innovation Corp. | Light-emission device, optical device, and information processing device |
JP2020170761A (en) * | 2019-04-02 | 2020-10-15 | 富士ゼロックス株式会社 | Light emitting device, optical device and information processing unit |
JP7413655B2 (en) | 2019-04-02 | 2024-01-16 | 富士フイルムビジネスイノベーション株式会社 | Light emitting device and information processing device |
WO2020202592A1 (en) * | 2019-04-02 | 2020-10-08 | 富士ゼロックス株式会社 | Light-emitting device, optical device, and information processing device |
EP3951424A4 (en) * | 2019-04-02 | 2023-02-08 | Fujifilm Business Innovation Corp. | Light-emitting device, optical device, and information processing device |
CN113574682B (en) * | 2019-04-10 | 2023-10-13 | 富士胶片商业创新有限公司 | Light emitting device, optical device, and information processing device |
US20220006268A1 (en) * | 2019-04-10 | 2022-01-06 | Fujifilm Business Innovation Corp. | Light-emitting device, optical device, and information processing device |
JP2020174097A (en) * | 2019-04-10 | 2020-10-22 | 富士ゼロックス株式会社 | Light-emitting device, optical device, and information processor |
JP2020174096A (en) * | 2019-04-10 | 2020-10-22 | 富士ゼロックス株式会社 | Light-emitting device, optical device, and information processor |
JP7413657B2 (en) | 2019-04-10 | 2024-01-16 | 富士フイルムビジネスイノベーション株式会社 | Optical equipment and information processing equipment |
WO2020208864A1 (en) * | 2019-04-10 | 2020-10-15 | 富士ゼロックス株式会社 | Light-emitting device, optical device, and information processing device |
CN114341674A (en) * | 2019-08-08 | 2022-04-12 | 麻省理工学院 | Ultra-wide field-of-view planar optical device |
JP2021027283A (en) * | 2019-08-08 | 2021-02-22 | 富士ゼロックス株式会社 | Light-emitting device, optical device, and information processing device |
JP7363179B2 (en) | 2019-08-08 | 2023-10-18 | 富士フイルムビジネスイノベーション株式会社 | Light emitting devices, optical devices and information processing devices |
CN113126111A (en) * | 2019-12-30 | 2021-07-16 | Oppo广东移动通信有限公司 | Time-of-flight module and electronic equipment |
CN113126111B (en) * | 2019-12-30 | 2024-02-09 | Oppo广东移动通信有限公司 | Time-of-flight module and electronic device |
CN111538024A (en) * | 2020-03-24 | 2020-08-14 | 深圳奥比中光科技有限公司 | Filtering ToF depth measurement method and device |
WO2024016478A1 (en) * | 2022-07-18 | 2024-01-25 | 奥比中光科技集团股份有限公司 | 3d sensing module, 3d sensing method, and electronic device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107884066A (en) | Optical sensor and its 3D imaging devices based on flood lighting function | |
CN107845627A (en) | Multi-proximity-detection optical sensor | |
US10650789B2 (en) | Method and apparatus for controlling display screen statuses | |
US10403232B2 (en) | Method of controlling display screen states, and apparatus | |
CN207382424U (en) | TOF camera modules and electronic equipment | |
US7679671B2 (en) | Image capturing for capturing an image of an object by illuminating the object and receiving light from the object | |
CN110249613B (en) | Display screen state control method and device, storage medium and electronic equipment | |
CN106385511B (en) | Sensor module, panel assembly and mobile terminal | |
US7646423B2 (en) | Image capture apparatus with illuminator and distance measuring light emitting device | |
US7978259B2 (en) | Image capturing apparatus for guiding light emitted from a plurality of light emitting devices | |
CN106453723B (en) | Sensor assembly and terminal | |
CN110049214A (en) | Camera assembly and electronic equipment | |
CN109819173B (en) | Depth fusion method based on TOF imaging system and TOF camera | |
CN106453725A (en) | Terminal | |
CN106444997B (en) | Sensor assembly, cover plate assembly and mobile terminal | |
EP2535741B1 (en) | System and method for reduction of optical noise | |
CN109754425A (en) | Calibration device and calibration method for a TOF camera module | |
CN206490707U (en) | Display screen and electronic equipment | |
CN109819144B (en) | TOF camera module and design method thereof | |
WO2021169531A1 (en) | Tof depth measurement apparatus, method for controlling tof depth measurement apparatus, and electronic device | |
CN102783123A (en) | Portable electronic device | |
CN106774656B (en) | Sensor assembly, cover plate, mobile terminal and terminal control method | |
CN211378086U (en) | Camera module type sensor device and camera module | |
CN107782354B (en) | Motion sensor detection system and method | |
CN209707866U (en) | Electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
CB02 | Change of applicant information | ||
Address after: 11-13/F, Joint Headquarters Building, High-tech Zone, 63 Xuefu Road, Yuehai Street, Nanshan District, Shenzhen, Guangdong 518000
Applicant after: Obi Zhongguang Technology Group Co., Ltd.
Address before: A808, Zhongdi Building, Industry-University-Research Base, China University of Geosciences, No. 8 Yuexing Third Road, Nanshan District, Shenzhen, Guangdong 518000
Applicant before: SHENZHEN ORBBEC Co., Ltd.
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 2018-04-06