CN109618085A - Electronic equipment and mobile platform - Google Patents

Electronic equipment and mobile platform

Info

Publication number
CN109618085A
Authority
CN
China
Prior art keywords
flight time
optical receiver
application processor
initial depth
optical
Prior art date
Legal status
Granted
Application number
CN201910007534.XA
Other languages
Chinese (zh)
Other versions
CN109618085B (en)
Inventor
张学勇 (Zhang Xueyong)
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201910007534.XA
Publication of CN109618085A
Application granted
Publication of CN109618085B
Legal status: Active
Anticipated expiration


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04B TRANSMISSION
    • H04B10/00 Transmission systems employing electromagnetic waves other than radio-waves, e.g. infrared, visible or ultraviolet light, or employing corpuscular radiation, e.g. quantum communication
    • H04B10/40 Transceivers
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/57 Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/73 Circuitry for compensating brightness variation in the scene by influencing the exposure time

Abstract

This application discloses an electronic device and a mobile platform. The electronic device includes a body and multiple time-of-flight assemblies arranged at multiple different orientations of the body. Each time-of-flight assembly includes a light emitter and a light receiver; the field of view of each light receiver is any value from 180 to 200 degrees, and the field of view of each light emitter is greater than or equal to that of the corresponding light receiver. The light emitter emits laser pulses outward from the body, and the light receiver receives the laser pulses emitted by the corresponding light emitter and reflected by a target subject. The multiple light emitters emit laser pulses in a time-division manner and the multiple light receivers expose at different times to obtain a panoramic depth image; while the light receiver of any one time-of-flight assembly is exposing, the light emitters of the other time-of-flight assemblies are switched off. In the electronic device of this application, the light emitters at different orientations emit laser pulses in turn and the multiple light receivers expose in turn, so comprehensive depth information can be acquired in a single pass.

Description

Electronic equipment and mobile platform
Technical field
This application relates to image acquisition technology, and more specifically to an electronic device and a mobile platform.
Background technique
To diversify the functions of an electronic device, a depth image acquisition apparatus can be provided on the device to capture depth images of a target subject. However, existing depth acquisition apparatuses can only capture a depth image in a single direction or within a single angular range, so the depth information obtained is limited.
Summary of the invention
Embodiments of the application provide an electronic device and a mobile platform.
The electronic device of the embodiments of the application includes a body and multiple time-of-flight assemblies arranged on the body. The multiple time-of-flight assemblies are located at multiple different orientations of the body, and each time-of-flight assembly includes a light emitter and a light receiver. The field of view of each light receiver is any value from 180 to 200 degrees, and the field of view of each light emitter is greater than or equal to the field of view of the corresponding light receiver. The light emitter emits laser pulses outward from the body, and the light receiver receives the laser pulses emitted by the corresponding light emitter and reflected by a target subject. The light emitters of the multiple time-of-flight assemblies emit laser pulses in a time-division manner, and the light receivers of the multiple time-of-flight assemblies expose at different times to receive the laser pulses, thereby obtaining a panoramic depth image. While the light receiver of any one time-of-flight assembly is exposing, the light emitters of all other time-of-flight assemblies are switched off.
The mobile platform of the embodiments of the application likewise includes a body and multiple time-of-flight assemblies arranged on the body. The multiple time-of-flight assemblies are located at multiple different orientations of the body, and each time-of-flight assembly includes a light emitter and a light receiver. The field of view of each light receiver is any value from 180 to 200 degrees, and the field of view of each light emitter is greater than or equal to the field of view of the corresponding light receiver. The light emitter emits laser pulses outward from the body, and the light receiver receives the laser pulses emitted by the corresponding light emitter and reflected by a target subject. The light emitters of the multiple time-of-flight assemblies emit laser pulses in a time-division manner, and the light receivers of the multiple time-of-flight assemblies expose at different times to receive the laser pulses, thereby obtaining a panoramic depth image. While the light receiver of any one time-of-flight assembly is exposing, the light emitters of all other time-of-flight assemblies are switched off.
In the electronic device and mobile platform of the embodiments of the application, the multiple light emitters located at multiple different orientations of the body emit laser pulses in a time-division manner, and the multiple light receivers expose at different times to obtain a panoramic depth image, so comprehensive depth information can be acquired in a single pass.
Additional aspects and advantages of the embodiments of the application will be set forth in part in the following description, will in part become apparent from it, or will be learned through practice of the embodiments.
Detailed description of the invention
The above and additional aspects and advantages of the application will become apparent and readily understood from the following description of embodiments taken in conjunction with the accompanying drawings, in which:
Fig. 1 is a schematic structural diagram of an electronic device according to certain embodiments of the application;
Fig. 2 is a schematic module diagram of an electronic device according to certain embodiments of the application;
Fig. 3 is a timing diagram of the time-division emission of laser pulses by multiple light emitters and the time-division exposure of multiple light receivers according to certain embodiments of the application;
Fig. 4(a) and Fig. 4(b) are timing diagrams of the time-division emission of laser pulses by multiple light emitters and the time-division exposure of multiple light receivers according to certain embodiments of the application;
Fig. 5(a) and Fig. 5(b) are timing diagrams of the time-division emission of laser pulses by multiple light emitters and the time-division exposure of multiple light receivers according to certain embodiments of the application;
Fig. 6(a) to Fig. 6(c) are timing diagrams of the time-division emission of laser pulses by multiple light emitters and the time-division exposure of multiple light receivers according to certain embodiments of the application;
Fig. 7 is a schematic module diagram of an electronic device according to certain embodiments of the application;
Fig. 8 is a schematic diagram of an application scenario of an electronic device according to certain embodiments of the application;
Fig. 9 is a schematic diagram of the coordinate systems used for stitching initial depth images according to certain embodiments of the application;
Fig. 10 to Fig. 14 are schematic diagrams of application scenarios of an electronic device according to certain embodiments of the application;
Fig. 15 to Fig. 18 are schematic structural diagrams of mobile platforms according to certain embodiments of the application.
Specific embodiment
Embodiments of the application are described below with reference to the accompanying drawings. The same or similar reference numerals throughout the drawings denote the same or similar elements, or elements having the same or similar functions. The embodiments described with reference to the drawings are exemplary, are intended only to explain the application, and should not be construed as limiting it.
Referring to Figs. 1 and 2, the electronic device 100 of the embodiments of the application includes a body 10, time-of-flight assemblies 20, camera assemblies 30, a microprocessor 40, and an application processor 50.
The body 10 has multiple different orientations. As shown in Fig. 1, the body 10 may have four different orientations, in clockwise order: a first orientation, a second orientation, a third orientation, and a fourth orientation, where the first orientation is opposite the third orientation and the second orientation is opposite the fourth orientation. The first orientation corresponds to the top of the body 10, the second orientation to its right side, the third orientation to its bottom, and the fourth orientation to its left side.
The time-of-flight assemblies 20 are arranged on the body 10. There may be multiple time-of-flight assemblies 20, located at multiple different orientations of the body 10. Specifically, there may be two: time-of-flight assembly 20a arranged at the first orientation and time-of-flight assembly 20b arranged at the third orientation. Of course, there may also be four (or any other number greater than two), with the two additional assemblies arranged at the second and fourth orientations. The embodiments are illustrated with two time-of-flight assemblies 20. It will be understood that two time-of-flight assemblies 20 suffice to obtain a panoramic depth image (a panoramic depth image here means one whose field of view is greater than or equal to 180 degrees; for example, the field of view of the panoramic depth image may be 180, 240, 360, 480, or 720 degrees), which helps save the manufacturing cost of the electronic device 100 and reduce its volume and power consumption. The electronic device 100 of this embodiment may be a portable electronic device provided with multiple time-of-flight assemblies 20, such as a mobile phone, a tablet computer, or a laptop; the body 10 is then the phone body, the tablet body, the laptop body, and so on. For an electronic device 100 with strict thickness requirements, such as a mobile phone, the thin body usually cannot accommodate a time-of-flight assembly 20 on its side, so using two time-of-flight assemblies 20 to obtain the panoramic depth image solves this problem: the two assemblies can be mounted on the front and back of the phone body. Obtaining the panoramic depth image with only two time-of-flight assemblies 20 also reduces the amount of computation needed for the panoramic depth image. Compared with obtaining a panoramic depth image using structured-light assemblies, a time-of-flight assembly 20 measures the depth of a distant subject with higher accuracy and is therefore suitable for measuring panoramic depth at long range; it also requires less data processing and less time to compute depth, making it suitable for scenes with high frame-rate requirements on the panoramic depth image, where multiple frames must be obtained.
Each time-of-flight assembly 20 includes a light emitter 22 and a light receiver 24. The light emitter 22 emits laser pulses outward from the body 10, and the light receiver 24 receives the laser pulses emitted by the corresponding light emitter 22 and reflected by the target subject. Specifically, time-of-flight assembly 20a includes light emitter 22a and light receiver 24a, and time-of-flight assembly 20b includes light emitter 22b and light receiver 24b. Light emitters 22a and 22b emit laser pulses toward the first and third orientations outside the body 10, respectively; light receivers 24a and 24b respectively receive the laser pulses emitted by 22a and reflected by the target subject at the first orientation, and the laser pulses emitted by 22b and reflected by the target subject at the third orientation. In this way every region around the body 10 can be covered. Unlike existing devices that must rotate 360 degrees to gather comprehensive depth information, the electronic device 100 of this embodiment acquires comprehensive depth information in a single pass without rotating, so it is simple to operate and fast to respond.
The field of view of each light receiver 24 is any value from 180 to 200 degrees, and the field of view of each light emitter 22 is greater than or equal to that of the corresponding light receiver 24. Here, a light emitter 22's field of view being greater than the corresponding receiver's means it is slightly greater. For example, if the field of view of a light receiver 24 is 180 degrees, the corresponding light emitter 22 may have a field of view of 181, 182.5, 185, 187, 188, or 190 degrees, and so on; if the receiver's field of view is 200 degrees, the emitter's may be 200.5, 201, 203, 204, 207, 208.6, 209, or 210 degrees, and so on. The description below is given in terms of the field of view of the light receiver 24; the field of view of the light emitter 22 may be greater than or equal to it, and this is not repeated each time.
In one embodiment, the fields of view of light receivers 24a and 24b are both 180 degrees. When a light receiver 24's field of view is smaller, its lens distortion is also smaller, so the initial depth images acquired, and hence the panoramic depth image obtained from them, are of better quality and yield more accurate depth information.
In one embodiment, the fields of view of light receivers 24a and 24b sum to 360 degrees. Specifically, each may be 180 degrees, with no overlap between the two receivers' fields of view, yielding a panoramic depth image of 360 degrees or approximately 360 degrees.
In one embodiment, the fields of view of light receivers 24a and 24b sum to more than 360 degrees, and the two receivers' fields of view overlap each other. Specifically, each may be 200 degrees, with the fields of view of the two receivers mutually overlapping. When obtaining the panoramic depth image, the overlapping edge portions of the two initial depth images are identified first, and the two initial depth images are then stitched into a 360-degree panoramic depth image. Because the fields of view of the two receivers overlap, the panoramic depth image is guaranteed to cover the full 360 degrees of depth information around the body 10.
Of course, the specific field of view of each light receiver 24 is not limited to the examples above; a person skilled in the art may set it to any value from 180 to 200 degrees as needed, for example 180, 181, 185, 187, 190, 195, 196.5, 198, 199, or 200 degrees, or any value in between. Preferably, the field of view of the light emitter 22 is also a value from 180 to 200 degrees; it only needs to be greater than or equal to the field of view of the corresponding light receiver 24, and is not otherwise limited.
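The field-of-view arithmetic above is easy to check mechanically. Below is a minimal sketch (Python is used purely for illustration, and the helper name is hypothetical) of how two opposite-facing receivers combine to cover the full circle:

```python
def coverage(fov_a_deg: float, fov_b_deg: float) -> dict:
    """Two receivers facing opposite orientations cover the full circle
    when their fields of view sum to at least 360 degrees; any excess
    over 360 degrees is overlap usable for aligning the edges of the
    two initial depth images during stitching."""
    total = fov_a_deg + fov_b_deg
    return {"full_circle": total >= 360.0,
            "overlap_deg": max(0.0, total - 360.0)}

print(coverage(180, 180))  # {'full_circle': True, 'overlap_deg': 0.0}
print(coverage(200, 200))  # {'full_circle': True, 'overlap_deg': 40.0}
```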
In general, when the fields of view of light emitters 22a and 22b overlap each other, the laser pulses they emit would easily interfere with one another if the two emitters emitted simultaneously. Therefore, to improve the accuracy of the acquired depth information, the light emitters 22 of the two time-of-flight assemblies 20 emit laser pulses in a time-division manner; correspondingly, the light receivers 24 of the two assemblies also expose in a time-division manner, each receiving the laser pulses emitted by its corresponding light emitter 22, so as to obtain the panoramic depth image. While the light receiver 24 of either time-of-flight assembly 20 is exposing, the light emitter 22 of the other time-of-flight assembly 20 is switched off. In this way each light receiver 24 can only receive the laser pulses emitted by its corresponding light emitter 22 and does not collect pulses emitted by the other emitter 22, which avoids the interference problem described above and guarantees the accuracy of the received laser pulses.
Specifically, referring to Figs. 3 and 4, in one embodiment the two light emitters 22 of the two time-of-flight assemblies 20 are switched on in turn and emit laser pulses without interruption, and the exposure time of the light receiver 24 in each time-of-flight assembly 20 lies within the time range in which its light emitter 22 is emitting laser pulses. Light emitters 22a and 22b emit laser pulses in a time-division manner: at the moment 22a stops emitting, 22b immediately starts emitting, and at the moment 22b stops emitting, 22a immediately starts emitting again. The emission times of 22a and 22b together make up one alternation cycle T. The exposure of light receivers 24a and 24b can then be controlled in either of the following two ways:
(1) Light receivers 24a and 24b are switched on in turn and expose without interruption. Specifically, the exposure time of each of the two receivers coincides with the emission time of its corresponding light emitter 22. As shown in Fig. 3, receivers 24a and 24b alternate in rapid succession. The exposure of 24a starts and ends exactly when emitter 22a of the current alternation cycle T starts and stops emitting laser pulses, and the exposure of 24b starts and ends exactly when emitter 22b of the current alternation cycle T starts and stops emitting laser pulses. Receiver 24a therefore receives only the laser pulses emitted by 22a and does not collect those emitted by 22b, and receiver 24b receives only the laser pulses emitted by 22b and does not receive those emitted by 22a. In this control mode, receiver 24a is synchronized with emitter 22a and receiver 24b with emitter 22b, so the control logic is comparatively simple.
(2) As shown in Fig. 4, light receivers 24a and 24b are switched on in turn with a predetermined interval between exposures, and the exposure time of at least one receiver 24 is shorter than the emission time of its corresponding light emitter 22. Specifically, as shown in Fig. 4(a), in one example receivers 24a and 24b alternate in rapid succession. The exposure time of 24a is shorter than the emission time of emitter 22a, while the exposure time of 24b equals the emission time of emitter 22b. The exposure of 24a starts after emitter 22a of the current alternation cycle T starts emitting and ends before 22a stops emitting; the exposure of 24b starts and ends exactly when emitter 22b of the current alternation cycle T starts and stops emitting. A predetermined interval Δt1 separates the end of 24a's exposure from the start of 24b's exposure in the current cycle, and a predetermined interval Δt2 separates the end of 24b's exposure from the start of 24a's exposure in the next alternation cycle T; Δt1 and Δt2 may be equal or different. Receiver 24a receives only the laser pulses emitted by 22a, and receiver 24b only those emitted by 22b. As shown in Fig. 4(b), in another example receivers 24a and 24b again alternate in rapid succession, but the exposure times of both 24a and 24b are shorter than the emission times of emitters 22a and 22b respectively. The exposure of 24a starts after 22a of the current alternation cycle T starts emitting and ends before 22a stops emitting; the exposure of 24b starts after 22b of the current cycle starts emitting and ends before 22b stops emitting. A predetermined interval Δt1 separates the end of 24a's exposure from the start of 24b's exposure in the current cycle, and a predetermined interval Δt2 separates the end of 24b's exposure from the start of 24a's exposure in the next alternation cycle T; Δt1 and Δt2 may be equal or different. Receiver 24a receives only the laser pulses emitted by 22a, and receiver 24b only those emitted by 22b. In this control mode, where the receivers are switched on in turn with a predetermined interval between exposures, the exposure time of at least one receiver 24 is shorter than the emission time of its emitter 22, which helps reduce the power consumption of the electronic device 100.
In the control mode in which the two light emitters 22 of the two time-of-flight assemblies 20 are switched on in turn and emit laser pulses without interruption, the time-of-flight assemblies 20 acquire initial depth images at a higher frame rate, which suits scenes with high frame-rate requirements on the initial depth images.
Referring to Figs. 5 and 6, in another embodiment the two light emitters 22 of the two time-of-flight assemblies 20 are switched on in turn with a predetermined interval between emissions; that is, emitters 22a and 22b emit laser pulses alternately, with a predetermined interval Δt3 between the end of 22a's emission and the start of 22b's emission within the current alternation cycle T, and a predetermined interval Δt4 between the end of 22b's emission and the start of 22a's emission in the next alternation cycle T. Δt3 and Δt4 may be equal or different. The emission times of 22a and 22b, together with the predetermined intervals Δt3 and Δt4, make up one alternation cycle T. The exposure of light receivers 24a and 24b can then be controlled in either of the following two ways:
(1) Light receivers 24a and 24b are switched on in turn and expose without interruption. Specifically, as shown in Fig. 5(a), in one example the exposure of 24a starts and ends exactly when emitter 22a of the current alternation cycle T starts and stops emitting laser pulses, while the exposure of 24b starts when 22a of the current cycle stops emitting and ends when 22a of the next alternation cycle T starts emitting. Receiver 24a receives only the laser pulses emitted by 22a, and receiver 24b only those emitted by 22b. As shown in Fig. 5(b), in another example the exposure of 24a starts when emitter 22a of the current alternation cycle T starts emitting and ends when emitter 22b of the current cycle starts emitting, while the exposure of 24b starts when 22b of the current cycle starts emitting and ends when 22a of the next alternation cycle T starts emitting. Again, receiver 24a receives only the laser pulses emitted by 22a, and receiver 24b only those emitted by 22b.
(2) Light receivers 24a and 24b are switched on in turn with a predetermined interval between exposures. Specifically, as shown in Fig. 6(a), in one example the exposure of 24a starts and ends exactly when emitter 22a of the current alternation cycle T starts and stops emitting, and the exposure of 24b starts and ends exactly when emitter 22b of the current cycle starts and stops emitting; the end of 24a's exposure and the start of 24b's exposure in the current cycle are separated by the predetermined interval Δt3, and the end of 24b's exposure and the start of 24a's exposure in the next alternation cycle T by the predetermined interval Δt4. Receiver 24a receives only the laser pulses emitted by 22a, and receiver 24b only those emitted by 22b. As shown in Fig. 6(b), in another example the exposure of 24a starts and ends exactly when emitter 22a of the current cycle starts and stops emitting, while the exposure of 24b starts after 22a of the current cycle stops emitting and ends before 22a of the next alternation cycle T starts emitting. The end of 24a's exposure and the start of 24b's exposure in the current cycle are separated by a predetermined interval Δt5, and the end of 24b's exposure and the start of 24a's exposure in the next cycle by a predetermined interval Δt6; Δt5 and Δt6 may be equal or different. Receiver 24a receives only the laser pulses emitted by 22a, and receiver 24b only those emitted by 22b. As shown in Fig. 6(c), in yet another example the exposure of 24a starts after emitter 22b of the previous alternation cycle T stops emitting and ends before emitter 22b of the current cycle starts emitting, while the exposure of 24b starts after 24a's exposure of the current cycle ends and ends before 24a's exposure of the next cycle starts. The end of 24a's exposure and the start of 24b's exposure in the current cycle are separated by a predetermined interval Δt5, and the end of 24b's exposure and the start of 24a's exposure in the next cycle by a predetermined interval Δt6; Δt5 and Δt6 may be equal or different. Receiver 24a receives only the laser pulses emitted by 22a, and receiver 24b only those emitted by 22b.
In the control mode in which the two light emitters 22 of the two time-of-flight assemblies 20 are switched on in turn with a predetermined interval between emissions, the time-of-flight assemblies 20 acquire initial depth images at a lower frame rate, which suits scenes with low frame-rate requirements on the initial depth images while also helping to reduce the power consumption of the electronic device 100.
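The two control modes can be made concrete with a small scheduling sketch. This is an illustrative model only (the millisecond units and helper names are assumptions, not from the patent): gap_ms = 0 reproduces the back-to-back emission of Figs. 3 and 4, gap_ms > 0 the interval emission of Figs. 5 and 6, and the assertion verifies the rule that a receiver exposing inside its own emitter's window never coincides with the other emitter being on:

```python
from typing import List, Tuple

Window = Tuple[float, float]  # (start_ms, end_ms)

def emission_schedule(n_cycles: int, on_ms: float, gap_ms: float = 0.0
                      ) -> Tuple[List[Window], List[Window]]:
    """Alternate emitter 22a and emitter 22b. gap_ms = 0 gives the
    back-to-back mode; gap_ms > 0 gives the interval mode."""
    a, b = [], []
    t = 0.0
    for _ in range(n_cycles):
        a.append((t, t + on_ms)); t += on_ms + gap_ms
        b.append((t, t + on_ms)); t += on_ms + gap_ms
    return a, b

def overlaps(w1: Window, w2: Window) -> bool:
    return w1[0] < w2[1] and w2[0] < w1[1]

a_windows, b_windows = emission_schedule(3, on_ms=5.0, gap_ms=1.0)
# Receiver 24a exposes inside emitter 22a's windows, so its exposure
# never overlaps any of emitter 22b's emission windows:
for exp in a_windows:
    assert not any(overlaps(exp, w) for w in b_windows)
print(a_windows)
print(b_windows)
```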
Referring to Figs. 1 and 2, the camera assemblies 30 are arranged on the body 10. There may be multiple camera assemblies 30, one per time-of-flight assembly 20. For example, when there are two time-of-flight assemblies 20, there are also two camera assemblies 30, arranged at the first and third orientations respectively.
The multiple camera assemblies 30 are connected to the application processor 50. Each camera assembly 30 captures a scene image of the target subject and outputs it to the application processor 50. In this embodiment, the two camera assemblies 30 respectively capture scene images of the target subjects at the first and third orientations and output them to the application processor 50. It will be appreciated that each camera assembly 30 has the same, or approximately the same, field of view as the light receiver 24 of its corresponding time-of-flight assembly 20, so that each scene image can be well matched with the corresponding initial depth image.
A camera assembly 30 may be a visible-light camera 32 or an infrared camera 34. When the camera assembly 30 is a visible-light camera 32, the scene image is a visible-light image; when it is an infrared camera 34, the scene image is an infrared image.
Referring to Figs. 2 and 7, the microprocessor 40 may be a processing chip.
In one embodiment, as shown in Fig. 2, there may be multiple microprocessors 40, one per time-of-flight assembly 20. For example, in this embodiment there are two time-of-flight assemblies 20 and also two microprocessors 40. Each microprocessor 40 is connected to both the light emitter 22 and the light receiver 24 of its corresponding time-of-flight assembly 20. Each microprocessor 40 can drive its light emitter 22 to emit laser pulses through a driver circuit, and the control of the two microprocessors 40 realizes the time-division emission of the two light emitters 22. Each microprocessor 40 also provides its light receiver 24 with the clock information for receiving laser pulses so that the receiver exposes, and the control of the two microprocessors 40 realizes the time-division exposure of the two light receivers 24. Each microprocessor 40 further obtains an initial depth image from the laser pulses emitted by its light emitter 22 and those received by its light receiver 24. For example, the two microprocessors 40 respectively obtain initial depth image P1 from the laser pulses emitted by 22a and received by 24a, and initial depth image P2 from the laser pulses emitted by 22b and received by 24b (as shown in the upper part of Fig. 8). Each microprocessor 40 can also apply algorithmic processing such as tiling, distortion correction, and self-calibration to the initial depth image to improve its quality.
In another embodiment, as shown in Fig. 7, there may be a single microprocessor 40, connected simultaneously to light emitter 22a, light receiver 24a, light emitter 22b, and light receiver 24b. The single microprocessor 40 can control two separate driver circuits in a time-division manner to drive the corresponding light emitters 22 to emit laser pulses, can provide the two light receivers 24 with the clock information for receiving laser pulses in a time-division manner so that the receivers 24 expose at different times, and successively obtains initial depth image P1 from the laser pulses emitted by 22a and received by 24a and initial depth image P2 from the laser pulses emitted by 22b and received by 24b (as shown in the upper part of Fig. 8). Compared with a single microprocessor, two microprocessors 40 process faster and with lower latency; compared with two, a single microprocessor 40 helps reduce the volume of the electronic device 100 as well as its manufacturing cost.
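The patent does not spell out the arithmetic by which the microprocessor 40 turns emitted and received pulses into an initial depth image, but a direct time-of-flight measurement conventionally converts the per-pixel round-trip delay into distance as d = c·Δt/2. A minimal sketch under that assumption:

```python
C_M_PER_S = 299_792_458.0  # speed of light

def depth_from_round_trip(delay_s: list[list[float]]) -> list[list[float]]:
    """Convert per-pixel round-trip delays (seconds) into a depth map
    in metres: the pulse travels to the subject and back, so the
    one-way distance is c * delay / 2."""
    return [[C_M_PER_S * dt / 2.0 for dt in row] for row in delay_s]

# e.g. a 33.4 ns round trip corresponds to roughly 5 m:
print(depth_from_round_trip([[33.4e-9]]))  # ~[[5.006...]]
```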
When there are two microprocessors 40, both are connected to the application processor 50 and transmit the initial depth images to it; when there is one microprocessor 40, it is connected to the application processor 50 and transmits the initial depth images to it. In one embodiment, the microprocessor 40 may be connected to the application processor 50 through a Mobile Industry Processor Interface (MIPI); specifically, the microprocessor 40 connects through the MIPI to the Trusted Execution Environment (TEE) of the application processor 50, so that the data in the microprocessor 40 (the initial depth images) are transmitted directly into the trusted execution environment, improving the security of the information in the electronic device 100. The code and memory regions of the trusted execution environment are controlled by an access control unit and cannot be accessed by programs in the Rich Execution Environment (REE); both the TEE and the REE may be formed in the application processor 50.
The application processor 50 may serve as the system of the electronic device 100. The application processor 50 can reset the microprocessor 40, wake it, debug it, and so on. It can also connect to multiple electronic components of the electronic device 100 and control them to run in predetermined modes. For example, the application processor 50 connects to the visible-light camera 32 and the infrared camera 34 to control them to capture visible-light images and infrared images and to process those images; when the electronic device 100 includes a display screen, the application processor 50 can control the screen to display predetermined pictures; and it can control the antenna of the electronic device 100 to send or receive predetermined data, and so on.
Referring to Fig. 8, the application processor 50 can synthesize the two initial depth images obtained by the two microprocessors 40 into one frame of panoramic depth image according to the fields of view of the light receivers 24; alternatively, the application processor 50 can synthesize the two initial depth images successively obtained by the single microprocessor 40 into one frame of panoramic depth image according to the fields of view of the light receivers 24.
Specifically, referring to Figs. 1 and 9, a rectangular coordinate system XOY is established with the center of the body 10 as the origin O, the transverse line as the X axis, and the longitudinal line as the Y axis. In this coordinate system, the field of view of light receiver 24a spans from 350 degrees to 190 degrees (rotating counterclockwise; likewise below), and the field of view of light receiver 24b spans from 170 degrees to 10 degrees. The application processor 50 then stitches initial depth images P1 and P2 into one frame of 360-degree panoramic depth image P12 according to the fields of view of the two light receivers 24, so that the depth information can be used.
In the initial depth image the microprocessor 40 obtains from the laser pulses emitted by the light emitter 22 and received by the light receiver 24, the depth information of each pixel is the distance between the target subject at the corresponding orientation and the light receiver 24 at that orientation. That is, in initial depth image P1 the depth information of each pixel is the distance between the target subject at the first orientation and receiver 24a, and in initial depth image P2 it is the distance between the target subject at the third orientation and receiver 24b. In the process of stitching the multiple initial depth images of multiple orientations into one frame of 360-degree panoramic depth image, the depth information of each pixel in every initial depth image must first be converted into unified depth information, which represents the distance of each target subject at each orientation from a common reference position. After the depth information is converted into unified depth information, the application processor 50 can conveniently perform the stitching of the initial depth images on the basis of the unified depth information.
Specifically, a reference coordinate system is selected; it may be the image coordinate system of the light receiver 24 of one orientation, or some other chosen coordinate system. Taking Fig. 9 as an example, let the xo-yo-zo coordinate system be the reference coordinate system; the coordinate system xa-ya-za shown in Fig. 9 is the image coordinate system of receiver 24a, and xb-yb-zb is the image coordinate system of receiver 24b. The application processor 50 converts the depth information of each pixel in initial depth image P1 into unified depth information using the rotation matrix and translation matrix between xa-ya-za and the reference system xo-yo-zo, and likewise converts the depth information of each pixel in initial depth image P2 using the rotation matrix and translation matrix between xb-yb-zb and xo-yo-zo.
Once the depth information conversion is complete, the multiple initial depth images all lie in a single unified reference coordinate system, and every pixel of every initial depth image has a corresponding coordinate (xo, yo, zo); the stitching of the initial depth images can then be done by coordinate matching. For example, if some pixel Pa in initial depth image P1 has coordinates (xo1, yo1, zo1) and some pixel Pb in initial depth image P2 also has coordinates (xo1, yo1, zo1), then since Pa and Pb have identical coordinate values in the current reference system, they are actually the same point, and Pa must coincide with Pb when P1 and P2 are stitched. In this way the application processor 50 can stitch the multiple initial depth images through the coordinate matching relationship and obtain the 360-degree panoramic depth image.
It should be noted that stitching initial depth images by coordinate matching requires the resolution of the initial depth images to exceed a preset resolution. It will be appreciated that if the resolution of the initial depth images is low, the accuracy of the coordinates (xo, yo, zo) is correspondingly low; direct coordinate matching may then pair Pa and Pb even though they do not actually coincide but differ by an offset whose value exceeds the error bound. If the resolution of the images is high, the accuracy of the coordinates (xo, yo, zo) is correspondingly high; even if Pa and Pb do not coincide exactly and differ by an offset, the value of the offset stays below the error bound, i.e. within the allowed error range, and does not noticeably affect the stitching of the initial depth images.
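The conversion to unified depth information and the coordinate-matching splice can be sketched as follows. The rotation, translation, and tolerance values are illustrative assumptions; the tolerance plays the role of the error bound discussed above:

```python
import numpy as np

def to_reference(points_cam: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Map 3-D points from a receiver's image coordinate system
    (e.g. xa-ya-za) into the reference system xo-yo-zo using the
    rotation matrix R and translation vector t between the two."""
    return points_cam @ R.T + t

def stitch(pts_a: np.ndarray, pts_b: np.ndarray, tol: float = 1e-2) -> np.ndarray:
    """Merge two point sets already expressed in the reference system.
    A point of image P2 lying within `tol` of a point of image P1 is
    treated as the same physical point (the offset stays inside the
    error bound) and is dropped rather than duplicated."""
    keep = [p for p in pts_b
            if np.linalg.norm(pts_a - p, axis=1).min() > tol]
    return np.vstack([pts_a, np.array(keep)]) if keep else pts_a

# Receiver 24a happens to coincide with the reference system; 24b is
# rotated 180 degrees about the vertical axis and offset along x.
R_a, t_a = np.eye(3), np.zeros(3)
R_b = np.array([[-1.0, 0, 0], [0, 1.0, 0], [0, 0, -1.0]])
t_b = np.array([0.1, 0.0, 0.0])

pts_a = to_reference(np.array([[0.0, 0.0, 2.0]]), R_a, t_a)
pts_b = to_reference(np.array([[0.1, 0.0, -2.0]]), R_b, t_b)
print(stitch(pts_a, pts_b))  # the shared point is matched, not duplicated
```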
It will be appreciated that the above approach can be used wherever two initial depth images are stitched or synthesized in the subsequent embodiments, and it is not described again each time.
The application processor 50 can also synthesize the two initial depth images with the corresponding two visible-light images into a three-dimensional scene image for display to the user. For example, with the two visible-light images being V1 and V2, the application processor 50 synthesizes initial depth image P1 with visible-light image V1 and initial depth image P2 with visible-light image V2, then stitches the two synthesized images into one frame of 360-degree three-dimensional scene image. Alternatively, the application processor 50 first stitches P1 and P2 into one frame of 360-degree panoramic depth image and stitches V1 and V2 into one frame of 360-degree panoramic visible-light image, then synthesizes the panoramic depth image and the panoramic visible-light image into the 360-degree three-dimensional scene image.
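One common way to realize such a synthesis, sketched here under an assumed pinhole camera model (the intrinsics fx, fy, cx, cy are illustrative, not from the patent), is to back-project each depth pixel into 3-D and attach the color of the aligned visible-light pixel:

```python
import numpy as np

def colored_point_cloud(depth: np.ndarray, rgb: np.ndarray,
                        fx: float, fy: float, cx: float, cy: float) -> np.ndarray:
    """Back-project a depth image through a pinhole model and attach
    the colour of the aligned visible-light image, producing an
    (N, 6) array of x, y, z, r, g, b - one simple way to combine an
    initial depth image with its scene image into 3-D scene data."""
    h, w = depth.shape
    v, u = np.mgrid[0:h, 0:w]
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    pts = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
    cols = rgb.reshape(-1, 3)
    return np.hstack([pts, cols])

cloud = colored_point_cloud(np.full((2, 2), 1.0),
                            np.zeros((2, 2, 3)), fx=500, fy=500, cx=1, cy=1)
print(cloud.shape)  # (4, 6)
```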
Referring to Fig. 10, in one embodiment the application processor 50 identifies the target subject from the two initial depth images respectively obtained by the two microprocessors 40 and the two scene images captured by the two camera assemblies 30, or from the two initial depth images successively obtained by the single microprocessor 40 and the two scene images captured by the two camera assemblies 30.
Specifically, when the scene images are infrared images, the two infrared images are respectively I1 and I2. The application processor 50 identifies the target subject at the first orientation from initial depth image P1 and infrared image I1, and the target subject at the third orientation from initial depth image P2 and infrared image I2. When the scene images are visible-light images, the two visible-light images are respectively V1 and V2, and the application processor 50 identifies the target subject at the first orientation from P1 and V1 and the target subject at the third orientation from P2 and V2.
When identifying the target subject means performing face recognition, using infrared images as the scene images gives the application processor 50 higher recognition accuracy. The face recognition process the application processor 50 performs from an initial depth image and an infrared image may be as follows:
First, face detection is performed on the infrared image to determine the target face region. Because the infrared image contains detailed scene information, face detection can be performed on the infrared image after it is captured to detect whether it contains a face. If the infrared image contains a face, the target face region where the face lies is extracted from it.
Then, liveness detection is performed on the target face region using the initial depth image. Because each initial depth image corresponds to an infrared image and contains the depth information of that infrared image, the depth information corresponding to the target face region can be obtained from the initial depth image. Furthermore, since a live face is three-dimensional while a face displayed in a picture or on a screen is planar, the depth information of the target face region can be used to judge whether the region is three-dimensional or planar, thereby performing liveness detection on the target face region.
If liveness detection succeeds, the target face attribute parameters of the target face region are obtained, and face matching is performed on the target face region in the infrared image according to those parameters to obtain the face matching result. The target face attribute parameters are parameters that can characterize the attributes of the target face; the target face can be identified and matched according to them. They include, but are not limited to, face deflection angle, face brightness parameters, facial feature parameters, skin-quality parameters, geometric feature parameters, and so on. The electronic device 100 can pre-store face attribute parameters for matching. After the target face attribute parameters are obtained, they are compared with the pre-stored face attribute parameters; if the target face attribute parameters match the pre-stored parameters, face recognition passes.
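The depth-based liveness step lends itself to a small sketch. This is a toy check under an assumed flatness threshold, not the patent's algorithm: a planar spoof such as a photo or a screen shows almost no depth relief across the face region, while a live face does:

```python
import numpy as np

def is_live_face(depth_roi: np.ndarray, flat_tol_m: float = 0.005) -> bool:
    """Liveness check from the initial depth image: a real face is
    three-dimensional, while a photo or a screen is (nearly) planar.
    The region counts as live when its depth relief exceeds a small
    tolerance; flat_tol_m is an illustrative threshold."""
    relief = float(depth_roi.max() - depth_roi.min())
    return relief > flat_tol_m

# A printed photo held up to the camera is essentially flat...
photo = np.full((32, 32), 0.40)
# ...while a real face has centimetres of relief (nose vs. cheeks).
yy, xx = np.mgrid[0:32, 0:32]
face = 0.40 - 0.02 * np.exp(-((xx - 16) ** 2 + (yy - 16) ** 2) / 50.0)

print(is_live_face(photo))  # False -> reject as spoof
print(is_live_face(face))   # True  -> proceed to attribute matching
```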
It should be pointed out that the specific face recognition process the application processor 50 performs from the initial depth image and the infrared image is not limited to the above; for example, the application processor 50 may also detect the facial contour with the aid of the initial depth image to improve recognition accuracy. The process of face recognition from an initial depth image and a visible-light image is similar to that from an initial depth image and an infrared image, and is not described separately here.
Referring to Figs. 10 and 11, when identification of the target subject from the two initial depth images and the two scene images fails, the application processor 50 synthesizes the two initial depth images obtained by the two microprocessors 40 into one frame of merged depth image according to the fields of view of the light receivers 24, synthesizes the two scene images captured by the two camera assemblies 30 into one frame of merged scene image, and identifies the target subject from the merged depth image and the merged scene image. Alternatively, when identification fails, the application processor 50 synthesizes the two initial depth images successively obtained by the single microprocessor 40 into one frame of merged depth image according to the fields of view of the light receivers 24, synthesizes the two scene images captured by the two camera assemblies 30 into one frame of merged scene image, and identifies the target subject from the merged depth image and the merged scene image.
Specifically, in the embodiment shown in Figs. 10 and 11, because the field of view of the light receiver 24 of each time-of-flight assembly 20 is limited, half of a face may lie in initial depth image P1 while the other half lies in initial depth image P2. The application processor 50 therefore synthesizes P1 and P2 into one frame of merged depth image P12, correspondingly synthesizes infrared images I1 and I2 (or visible-light images V1 and V2) into one frame of merged scene image I12 (or V12), and then identifies the target subject again from merged depth image P12 and merged scene image I12 (or V12).
Referring to Figs. 12 and 13, in one embodiment the application processor 50 judges the change in distance between the target subject and the electronic device 100 from multiple initial depth images.
Specifically, each light receiver 24 can receive laser pulses repeatedly. For example, receiver 24a receives laser pulses at a first moment t1 and receiver 24b at a second moment t2 (t1 and t2 lying within the same alternation cycle T); the two microprocessors 40 correspondingly obtain initial depth images P11 and P21, or the single microprocessor 40 successively obtains P11 and P21. Receiver 24a then receives laser pulses at a third moment t3 and receiver 24b at a fourth moment t4 (t3 and t4 lying within the same alternation cycle T); the two microprocessors 40 correspondingly obtain initial depth images P12 and P22, or the single microprocessor 40 successively obtains P12 and P22. The application processor 50 then judges the change in distance between the target subject at the first orientation and the electronic device 100 from P11 and P12, and the change in distance between the target subject at the third orientation and the electronic device 100 from P21 and P22.
It will be appreciated that because the initial depth images contain the depth information of the target subject, the application processor 50 can judge the change in distance between the target subject at a given orientation and the electronic device 100 from the depth information at multiple consecutive moments.
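A minimal sketch of this judgment (the median as representative depth and the dead band are illustrative assumptions, not the patent's implementation): compare consecutive initial depth images of the same orientation:

```python
import numpy as np

def distance_change(depth_t0: np.ndarray, depth_t1: np.ndarray,
                    eps_m: float = 0.01) -> str:
    """Judge how the subject-to-device distance changed between two
    consecutive initial depth images of the same orientation. The
    median serves as a robust representative depth; eps_m is an
    assumed dead band against sensor noise."""
    d0, d1 = float(np.median(depth_t0)), float(np.median(depth_t1))
    if d1 < d0 - eps_m:
        return "approaching"
    if d1 > d0 + eps_m:
        return "receding"
    return "steady"

p11 = np.full((4, 4), 1.50)   # initial depth image at t1
p12 = np.full((4, 4), 1.20)   # initial depth image at t3
print(distance_change(p11, p12))  # -> "approaching"
```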
Referring to Fig. 14, when judging the distance change from multiple initial depth images fails, the application processor 50 synthesizes the two initial depth images obtained by the two microprocessors 40 into one frame of merged depth image according to the fields of view of the light receivers 24, performs this synthesis step continuously to obtain multiple consecutive frames of merged depth images, and judges the distance change from those frames. Alternatively, when the judgment fails, the application processor 50 synthesizes the two initial depth images successively obtained by the single microprocessor 40 into one frame of merged depth image according to the fields of view of the light receivers 24, performs the synthesis step continuously to obtain multiple consecutive frames of merged depth images, and judges the distance change from those frames.
Specifically, in the embodiment shown in Figure 14, because the field of view of the optical receiver 24 of each time-of-flight component 20 is limited, half of a face may lie in initial depth image P11 while the other half lies in initial depth image P21. The application processor 50 therefore synthesizes the initial depth image P11 of the first moment t1 and the initial depth image P21 of the second moment t2 into one merged depth image P121, correspondingly synthesizes the initial depth image P12 of the third moment t3 and the initial depth image P22 of the fourth moment t4 into one merged depth image P122, and then judges the distance change anew from the two merged frames P121 and P122.
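A compact sketch of this fallback is given below: consecutive pairs of initial depth images are merged first, and the distance judgment is then repeated on the merged frames. The pixel-wise minimum used as a stand-in for the field-of-view-based stitch is an assumption made for brevity.

```python
import numpy as np

def merge(pa: np.ndarray, pb: np.ndarray) -> np.ndarray:
    # Stand-in stitch: frames assumed pre-aligned, nearer depth wins.
    return np.minimum(pa, pb)

# Initial depth images from the two orientations over two cycles.
p11, p12 = np.full((4, 4), 4.0), np.full((4, 4), 3.5)  # first orientation
p21, p22 = np.full((4, 4), 6.0), np.full((4, 4), 5.6)  # third orientation

# Judge on merged frames P121, P122 when the subject spans both views.
p121, p122 = merge(p11, p21), merge(p12, p22)
delta = float(np.median(p122) - np.median(p121))  # -0.5: approaching
```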
Referring again to Figure 14, when it is judged from multiple initial depth images, or from multiple frames of merged depth images, that the distance is decreasing, the application processor 50 can raise the rate at which it samples, from the initial depth images transmitted by the microprocessor(s) 40, the initial depth images used to judge the distance change. Specifically, when there are multiple microprocessors 40, the application processor 50 can raise the sampling rate for the initial depth images transmitted by at least one of the microprocessors 40; when there is a single microprocessor 40, the application processor 50 can raise the sampling rate for the initial depth images transmitted by that microprocessor 40.
It will be appreciated that, when the distance between the target subject and the electronic device 100 decreases, the electronic device 100 cannot know in advance whether the decreasing distance poses a risk. The application processor 50 therefore raises the rate at which it samples distance-judging frames from the initial depth images transmitted by the microprocessor 40, so as to follow the distance change more closely. Specifically, when the distance in some orientation is judged to be decreasing, the application processor 50 can raise the sampling rate for the initial depth images of that orientation transmitted by the microprocessor 40.
For example, at a first moment t1 one microprocessor 40 obtains initial depth image P11, and at a second moment t2 the other microprocessor 40 obtains initial depth image P21 (or a single microprocessor 40 obtains P11 at t1 and P21 at t2); at a third moment t3 one microprocessor 40 obtains initial depth image P12, and at a fourth moment t4 the other obtains initial depth image P22 (or a single microprocessor 40 obtains P12 at t3 and P22 at t4); at a fifth moment t5 one microprocessor 40 obtains initial depth image P13, and at a sixth moment t6 the other obtains initial depth image P23 (or a single microprocessor 40 obtains P13 at t5 and P23 at t6); at a seventh moment t7 one microprocessor 40 obtains initial depth image P14, and at an eighth moment t8 the other obtains initial depth image P24 (or a single microprocessor 40 obtains P14 at t7 and P24 at t8). Here t1 and t2 lie within the same alternation cycle T, as do t3 and t4, t5 and t6, and t7 and t8.
Under normal circumstances, the application processor 50 selects initial depth images P11 and P14 to judge the change in the distance between the target subject in the first orientation and the electronic device 100, and selects initial depth images P21 and P24 to judge the change in the distance between the target subject in the third orientation and the electronic device 100. In each orientation the application processor 50 samples one initial depth image out of every three, i.e., with an interval of two frames between samples.
When it is judged from initial depth images P11 and P14 that the distance in the first orientation is decreasing, the application processor 50 can instead select initial depth images P11 and P13 to judge the change in the distance between the target subject in the first orientation and the electronic device 100. The sampling rate for the first orientation thus becomes one frame out of every two, i.e., with an interval of one frame between samples, while the rate for the other orientations is unchanged: the application processor 50 still selects initial depth images P21 and P24 to judge the distance change.
Of course, when the distance in any one orientation is judged to be decreasing, the application processor 50 may also raise the sampling rate for the initial depth images of every orientation transmitted by the microprocessors 40. That is, when it is judged from initial depth images P11 and P14 that the distance between the target subject in the first orientation and the electronic device 100 is decreasing, the application processor 50 can select initial depth images P11 and P13 to judge the distance change in the first orientation and select initial depth images P21 and P23 to judge the distance change in the third orientation.
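The change of sampling stride in this example, one frame in three normally and one frame in two once the distance is seen to decrease, can be written out as below; the frame labels and helper names are illustrative only.

```python
def select_frames(stream: list, stride: int) -> list:
    """Sample every `stride`-th initial depth image from the stream."""
    return stream[::stride]

def pick_stride(distance_decreasing: bool) -> int:
    # Normal rate: one frame in three (P11, P14, ...). Once the distance
    # in this orientation decreases, sample one frame in two.
    return 2 if distance_decreasing else 3

first_orientation = ["P11", "P12", "P13", "P14", "P15"]
print(select_frames(first_orientation, pick_stride(False)))  # ['P11', 'P14']
print(select_frames(first_orientation, pick_stride(True)))   # ['P11', 'P13', 'P15']
```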
When the distance decreases, the application processor 50 can also judge the distance change in combination with the visible-light images or the infrared images. Specifically, the application processor 50 first identifies the target subject from the visible-light or infrared images and then judges the distance change from the initial depth images of multiple moments, so that the electronic device 100 can perform different operations for different target subjects at different distances. Alternatively, when the distance decreases, the microprocessor 40 controls the corresponding optical transmitter 22 to raise the frequency at which it emits laser pulses and the corresponding optical receiver 24 to raise the frequency at which it exposes.
It should be noted that the electronic device 100 of this embodiment can also be used as an external terminal, mounted fixedly or detachably on a portable electronic device such as a mobile phone, tablet computer, or laptop computer, or mounted fixedly on a movable object such as a vehicle body (as shown in Figures 13 and 14), an unmanned aerial vehicle body, a robot body, or a boat body. In use, when the electronic device 100 synthesizes one frame of panoramic depth image from multiple initial depth images as described above, the panoramic depth image can be used for three-dimensional modeling, simultaneous localization and mapping (SLAM), and augmented reality display. When the electronic device 100 identifies the target subject as described above, it can be applied to face-recognition unlocking and payment on portable electronic devices, or to obstacle avoidance for robots, vehicles, unmanned aerial vehicles, boats, and the like. When the electronic device 100 judges the change in the distance between the target subject and itself as described above, it can be applied to autonomous operation and object tracking for robots, vehicles, unmanned aerial vehicles, boats, and the like.
Referring to Figure 2 and Figure 15, an embodiment of the present application also provides a mobile platform 300. The mobile platform 300 includes a body 10 and multiple time-of-flight components 20 arranged on the body 10 in multiple different orientations. Each time-of-flight component 20 includes an optical transmitter 22 and an optical receiver 24. The field of view of each optical receiver 24 is any value from 180 to 200 degrees, and the field of view of each optical transmitter 22 is greater than or equal to that of the corresponding optical receiver 24. The optical transmitters 22 emit laser pulses outward from the body 10, and each optical receiver 24 receives the laser pulses, emitted by the corresponding optical transmitter 22, that are reflected by the target subject. The optical transmitters 22 of the multiple time-of-flight components 20 emit laser pulses in a time-division manner, and the optical receivers 24 of the multiple time-of-flight components 20 expose at different times, so as to obtain a panoramic depth image.
Specifically, the body 10 may be a vehicle body, an unmanned aerial vehicle body, a robot body, or a boat body.
Referring to Figure 15, when the body 10 is a vehicle body, the number of time-of-flight components 20 is two, and the two are mounted on opposite sides of the vehicle body, for example on the front and rear, or on the left and right sides. The vehicle body carries the two time-of-flight components 20 as it moves along the road and can construct a 360-degree panoramic depth image of the route for use as a reference map, or it can obtain initial depth images of multiple different orientations so as to identify the target subject and judge the change in the distance between the target subject and the mobile platform 300, thereby controlling the vehicle body to accelerate, decelerate, stop, or detour and realizing unmanned obstacle avoidance. For example, while the vehicle is moving along the road, if it recognizes that the distance between a target subject and the vehicle is decreasing and that the target subject is a pit in the road, the vehicle decelerates with a first acceleration; if it recognizes that the distance is decreasing and that the target subject is a person, the vehicle decelerates with a second acceleration, where the absolute value of the first acceleration is smaller than the absolute value of the second acceleration. Performing different operations for different target subjects when the distance decreases in this way makes the vehicle more intelligent.
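As a sketch of that decision, assuming a simple classifier output and deceleration constants invented for the example (the embodiment only requires that the first magnitude be smaller than the second):

```python
# Illustrative deceleration magnitudes in m/s^2; only |FIRST| < |SECOND|
# is required by the embodiment, the numbers themselves are invented.
FIRST_DECEL, SECOND_DECEL = 2.0, 6.0

def braking_command(subject_class: str, distance_decreasing: bool) -> float:
    """Commanded deceleration for a recognized subject ahead."""
    if not distance_decreasing:
        return 0.0
    if subject_class == "person":
        return SECOND_DECEL   # brake harder for a person
    return FIRST_DECEL        # gentler braking for a pit or other subject

print(braking_command("person", True))  # 6.0
print(braking_command("pit", True))     # 2.0
```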
Referring to Figure 16, when the body 10 is an unmanned aerial vehicle body, the number of time-of-flight components 20 is two, and the two are mounted on opposite sides of the body, such as the front and rear or the left and right, or on opposite sides of a gimbal carried on the body. The unmanned aerial vehicle body carries the time-of-flight components 20 in flight for aerial photography, inspection, and the like; the drone can return the acquired panoramic depth image to a ground control terminal or perform SLAM directly. The multiple time-of-flight components 20 enable the drone to accelerate, decelerate, stop, avoid obstacles, and track objects.
Referring to Figure 17, when the body 10 is a robot body, for example that of a sweeping robot, the number of time-of-flight components 20 is two, mounted on opposite sides of the robot body. The robot body carries the time-of-flight components 20 as it moves around the home, obtaining initial depth images of multiple different orientations so as to identify the target subject and judge the change in the distance between the target subject and the mobile platform 300, thereby controlling the robot body's movement and enabling the robot to remove rubbish, avoid obstacles, and so on.
Referring to Figure 18, when the body 10 is a boat body, the number of time-of-flight components 20 is two, mounted on opposite sides of the boat body. The boat body carries the time-of-flight components 20 as it moves, obtaining initial depth images of multiple different orientations so that, even in adverse environments (for example in haze), the target subject can be identified accurately and the change in the distance between the target subject and the mobile platform 300 can be judged, improving safety at sea.
The mobile platform 300 of this embodiment is a platform that can move independently, with the multiple time-of-flight components 20 mounted on its body 10 to obtain panoramic depth images. By contrast, the body of the electronic device 100 of the earlier embodiments generally cannot move on its own; the electronic device 100 can instead be carried on a movable apparatus such as the mobile platform 300, thereby helping that apparatus obtain panoramic depth images.
It should be pointed out that the foregoing description of the body 10, the time-of-flight components 20, the camera assemblies 30, the microprocessors 40, and the application processor 50 of the electronic device 100 applies equally to the mobile platform 300 of this embodiment and is not repeated here.
Although embodiments of the present application have been shown and described above, it is to be understood that the above embodiments are exemplary and are not to be construed as limiting the application. Those of ordinary skill in the art may change, modify, replace, and vary the above embodiments within the scope of the application, which is defined by the claims and their equivalents.

Claims (17)

1. An electronic device, characterized in that the electronic device comprises:
a body; and
a plurality of time-of-flight components arranged on the body, the plurality of time-of-flight components being located in a plurality of different orientations of the body, each time-of-flight component comprising an optical transmitter and an optical receiver, the field of view of each optical receiver being any value from 180 degrees to 200 degrees, the field of view of each optical transmitter being greater than or equal to the field of view of the corresponding optical receiver, the optical transmitter being configured to emit laser pulses outward from the body, and the optical receiver being configured to receive the laser pulses emitted by the corresponding optical transmitter and reflected by a target subject;
wherein the optical transmitters of the plurality of time-of-flight components emit the laser pulses in a time-division manner, and the optical receivers of the plurality of time-of-flight components expose at different times to receive the laser pulses, so as to obtain a panoramic depth image; when the optical receiver of any one of the time-of-flight components is exposing, the optical transmitters of the other time-of-flight components are all turned off.
2. The electronic device according to claim 1, characterized in that there are two time-of-flight components, the two optical transmitters of the two time-of-flight components are switched on in turn and emit the laser pulses without interruption, and the exposure time of the optical receiver in each time-of-flight component lies within the time range in which the optical transmitter of that time-of-flight component emits the laser pulses.
3. The electronic device according to claim 1, characterized in that there are two time-of-flight components, the two optical transmitters of the two time-of-flight components are switched on in turn and emit the laser pulses separated by a predetermined interval, and the two optical receivers of the two time-of-flight components are switched on in turn and expose separated by the predetermined interval.
4. The electronic device according to claim 1, characterized in that there are two time-of-flight components, the two optical transmitters of the two time-of-flight components are switched on in turn and emit the laser pulses separated by a predetermined interval, and the two optical receivers of the two time-of-flight components are switched on in turn and expose without interruption.
5. The electronic device according to claim 1, characterized in that there are two time-of-flight components, and the electronic device further comprises an application processor and two microprocessors, each microprocessor corresponding to one time-of-flight component and both microprocessors being connected to the application processor; each microprocessor is configured to process the laser pulses received by the optical receiver of the corresponding time-of-flight component to obtain an initial depth image and transmit it to the application processor; and the application processor is configured to synthesize the two initial depth images obtained by the two microprocessors into one frame of the panoramic depth image according to the fields of view of the optical receivers.
6. The electronic device according to claim 1, characterized in that there are two time-of-flight components, and the electronic device further comprises an application processor and one microprocessor connected to the application processor; the microprocessor is configured to process in sequence the laser pulses received by the two optical receivers of the two time-of-flight components to obtain two initial depth images and transmit them to the application processor; and the application processor is configured to synthesize the two initial depth images obtained by the microprocessor into one frame of the panoramic depth image according to the fields of view of the optical receivers.
7. The electronic device according to claim 1, characterized in that there are two time-of-flight components, and the electronic device further comprises an application processor and two microprocessors, each microprocessor corresponding to one time-of-flight component and both microprocessors being connected to the application processor; each microprocessor is configured to process the laser pulses received by the optical receiver of the corresponding time-of-flight component to obtain an initial depth image and transmit it to the application processor;
the electronic device further comprises two camera assemblies arranged on the body, each camera assembly corresponding to one time-of-flight component, the two camera assemblies being connected to the application processor, each camera assembly being configured to acquire a scene image of the target subject and output it to the application processor; and
the application processor is configured to identify the target subject from the two initial depth images obtained by the two microprocessors and the two scene images acquired by the two camera assemblies.
8. The electronic device according to claim 7, characterized in that the application processor is further configured so that, when identifying the target subject from the two initial depth images and the two scene images fails, it synthesizes the two initial depth images obtained by the two microprocessors into one merged depth image according to the fields of view of the optical receivers, synthesizes the two scene images acquired by the two camera assemblies into one merged scene image, and identifies the target subject from the merged depth image and the merged scene image.
9. The electronic device according to claim 1, characterized in that there are two time-of-flight components, and the electronic device further comprises an application processor and one microprocessor connected to the application processor; the microprocessor is configured to process in sequence the laser pulses received by the two optical receivers of the two time-of-flight components to obtain two initial depth images and transmit them to the application processor;
the electronic device further comprises two camera assemblies arranged on the body, each camera assembly corresponding to one time-of-flight component, the two camera assemblies being connected to the application processor, each camera assembly being configured to acquire a scene image of the target subject and output it to the application processor; and
the application processor is configured to identify the target subject from the two initial depth images obtained by the microprocessor and the two scene images acquired by the two camera assemblies.
10. The electronic device according to claim 9, characterized in that the application processor is further configured so that, when identifying the target subject from the two initial depth images and the two scene images fails, it synthesizes the two initial depth images obtained by the microprocessor into one merged depth image according to the fields of view of the optical receivers, synthesizes the two scene images acquired by the two camera assemblies into one merged scene image, and identifies the target subject from the merged depth image and the merged scene image.
11. The electronic device according to claim 1, characterized in that there are two time-of-flight components, and the electronic device further comprises an application processor and two microprocessors, each microprocessor corresponding to one time-of-flight component and both microprocessors being connected to the application processor; each microprocessor is configured to process the laser pulses repeatedly received by the optical receiver of the corresponding time-of-flight component to obtain initial depth images and transmit them to the application processor; and the application processor is configured to judge the change in the distance between the target subject and the electronic device from multiple of the initial depth images.
12. The electronic device according to claim 11, characterized in that the application processor is further configured so that, when judging the distance change from multiple initial depth images fails, it synthesizes the two initial depth images obtained by the two microprocessors into one merged depth image according to the fields of view of the optical receivers; the application processor performs the synthesis step repeatedly to obtain multiple consecutive frames of merged depth images and judges the distance change from the multiple frames of merged depth images.
13. The electronic device according to claim 1, characterized in that there are two time-of-flight components, and the electronic device further comprises an application processor and one microprocessor connected to the application processor; the microprocessor is configured to process in sequence the laser pulses repeatedly received by the optical receivers of the two time-of-flight components to obtain initial depth images and transmit them to the application processor; and the application processor is configured to judge the change in the distance between the target subject and the electronic device from multiple of the initial depth images.
14. The electronic device according to claim 13, characterized in that the application processor is further configured so that, when judging the distance change from multiple initial depth images fails, it synthesizes the two initial depth images obtained by the microprocessor into one merged depth image according to the fields of view of the optical receivers; the application processor performs the synthesis step repeatedly to obtain multiple consecutive frames of merged depth images and judges the distance change from the multiple frames of merged depth images.
15. The electronic device according to any one of claims 11 to 14, characterized in that the application processor is further configured so that, when the distance change is judged to be a decrease in distance, it raises the rate at which the initial depth images used to judge the distance change are sampled from the multiple initial depth images transmitted by the microprocessor.
16. A mobile platform, characterized in that the mobile platform comprises:
a body; and
a plurality of time-of-flight components arranged on the body, the plurality of time-of-flight components being located in a plurality of different orientations of the body, each time-of-flight component comprising an optical transmitter and an optical receiver, the field of view of each optical receiver being any value from 180 degrees to 200 degrees, the field of view of each optical transmitter being greater than or equal to the field of view of the corresponding optical receiver, the optical transmitter being configured to emit laser pulses outward from the body, and the optical receiver being configured to receive the laser pulses emitted by the corresponding optical transmitter and reflected by a target subject;
wherein the optical transmitters of the plurality of time-of-flight components emit the laser pulses in a time-division manner, and the optical receivers of the plurality of time-of-flight components expose at different times to receive the laser pulses, so as to obtain a panoramic depth image; when the optical receiver of any one of the time-of-flight components is exposing, the optical transmitters of the other time-of-flight components are all turned off.
17. The mobile platform according to claim 16, characterized in that the body is a vehicle body, an unmanned aerial vehicle body, a robot body, or a boat body.
CN201910007534.XA 2019-01-04 2019-01-04 Electronic equipment and mobile platform Active CN109618085B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910007534.XA CN109618085B (en) 2019-01-04 2019-01-04 Electronic equipment and mobile platform

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910007534.XA CN109618085B (en) 2019-01-04 2019-01-04 Electronic equipment and mobile platform

Publications (2)

Publication Number Publication Date
CN109618085A true CN109618085A (en) 2019-04-12
CN109618085B CN109618085B (en) 2021-05-14

Family

ID=66016359

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910007534.XA Active CN109618085B (en) 2019-01-04 2019-01-04 Electronic equipment and mobile platform

Country Status (1)

Country Link
CN (1) CN109618085B (en)

Patent Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102124320A (en) * 2008-06-18 2011-07-13 苏尔吉克斯有限公司 A method and system for stitching multiple images into a panoramic image
CN101494736A (en) * 2009-02-10 2009-07-29 杨立群 Filming system
CN102129550A (en) * 2011-02-17 2011-07-20 华南理工大学 Scene perception method
US20180139431A1 (en) * 2012-02-24 2018-05-17 Matterport, Inc. Capturing and aligning panoramic image and depth data
US20130250045A1 (en) * 2012-03-23 2013-09-26 Electronics And Telecommunications Research Institute Apparatus and method for generating and consuming three-dimensional (3d) data format to generate realistic panoramic image
US20140094307A1 (en) * 2012-10-01 2014-04-03 Microsoft Corporation Multi-camera depth imaging
US20160317121A1 (en) * 2013-12-16 2016-11-03 Universitat Bern Computed ultrasound tomography in echo mode (cute) for imaging speed of sound using pulse-echo sonography
CN106461783A (en) * 2014-06-20 2017-02-22 高通股份有限公司 Automatic multiple depth cameras synchronization using time sharing
CN104055487A (en) * 2014-06-27 2014-09-24 辛勤 Method for triggering camera and portable physiological parameter measurement equipment
CN104055489A (en) * 2014-07-01 2014-09-24 李栋 Blood vessel imaging device
CN107924040A (en) * 2016-02-19 2018-04-17 索尼公司 Image pick-up device, image pickup control method and program
CN106991716A (en) * 2016-08-08 2017-07-28 深圳市圆周率软件科技有限责任公司 A kind of panorama three-dimensional modeling apparatus, method and system
US20180103213A1 (en) * 2016-10-06 2018-04-12 Fyusion, Inc. Live style transfer on a mobile device
CN108471487A (en) * 2017-02-23 2018-08-31 钰立微电子股份有限公司 Generate the image device and associated picture device of panoramic range image
CN107263480A (en) * 2017-07-21 2017-10-20 深圳市萨斯智能科技有限公司 A kind of robot manipulation's method and robot
CN107465906A (en) * 2017-08-09 2017-12-12 广东欧珀移动通信有限公司 Panorama shooting method, device and the terminal device of scene
CN107742296A (en) * 2017-09-11 2018-02-27 广东欧珀移动通信有限公司 Dynamic image generation method and electronic installation
CN108174180A (en) * 2018-01-02 2018-06-15 京东方科技集团股份有限公司 A kind of display device, display system and 3 D displaying method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHANG Chi et al.: "光场成像技术及其在计算机视觉中的应用" (Light field imaging technology and its applications in computer vision), 《中国图象图形学报》 (Journal of Image and Graphics) *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111246073A (en) * 2020-03-23 2020-06-05 维沃移动通信有限公司 Imaging device, method and electronic equipment
CN111246073B (en) * 2020-03-23 2022-03-25 维沃移动通信有限公司 Imaging device, method and electronic equipment
WO2022002162A1 (en) * 2020-05-29 2022-01-06 华为技术有限公司 Electronic device and depth image photographing method

Also Published As

Publication number Publication date
CN109618085B (en) 2021-05-14

Similar Documents

Publication Publication Date Title
CN109862275A (en) Electronic equipment and mobile platform
CN108027441A (en) Mixed mode depth detection
CN108174180B (en) A kind of display device, display system and 3 D displaying method
CN110006343A (en) Measurement method, device and the terminal of object geometric parameter
US20040046737A1 (en) Information input apparatus, information input method, and recording medium
US20210232858A1 (en) Methods and systems for training an object detection algorithm using synthetic images
CN109618108A (en) Electronic equipment and mobile platform
CN109831660A (en) Depth image acquisition method, depth image obtaining module and electronic equipment
JP2023509137A (en) Systems and methods for capturing and generating panoramic 3D images
EP3968284A1 (en) Model acquisition method, object pre-determination method and devices
CN110213413A (en) The control method and electronic device of electronic device
CN108885487A (en) A kind of gestural control method of wearable system and wearable system
CN109618085A (en) Electronic equipment and mobile platform
CN109688400A (en) Electronic equipment and mobile platform
WO2022161386A1 (en) Pose determination method and related device
CN109587304A (en) Electronic equipment and mobile platform
CN109803089A (en) Electronic equipment and mobile platform
CN109660731A (en) Electronic equipment and mobile platform
CN107330421A (en) Input and output module and electronic installation
CN109788172A (en) Electronic equipment and mobile platform
CN109788195A (en) Electronic equipment and mobile platform
CN109587303A (en) Electronic equipment and mobile platform
CN109660733A (en) Electronic equipment and mobile platform
US11943539B2 (en) Systems and methods for capturing and generating panoramic three-dimensional models and images
US20240046560A1 (en) Three-Dimensional Model Reconstruction Method, Device, and Storage Medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant