CN109788195A - Electronic equipment and mobile platform - Google Patents
Electronic equipment and mobile platform
- Publication number: CN109788195A
- Application number: CN201910007545.8A
- Authority
- CN
- China
- Prior art keywords
- optical receiver
- laser pulse
- optical transmitter
- optical
- application processor
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Landscapes
- Optical Radar Systems And Details Thereof (AREA)
Abstract
This application discloses an electronic device and a mobile platform. The electronic device includes a body and multiple time-of-flight assemblies arranged on the body. The time-of-flight assemblies are located at multiple different orientations of the body, and each includes an optical transmitter and an optical receiver. The optical transmitter emits laser pulses outward from the body, and the optical receiver receives the laser pulses emitted by the corresponding optical transmitter and reflected by a target subject. The optical transmitters in the time-of-flight assemblies at adjacent orientations emit laser pulses in a time-division manner, and the optical receivers in those assemblies expose at different times, so as to obtain a panoramic depth image. In the electronic device and mobile platform of the embodiments of this application, because the optical transmitters at adjacent orientations of the body emit laser pulses in a time-division manner and the optical receivers expose at different times, relatively comprehensive depth information can be obtained in a single pass.
Description
Technical field
This application relates to image acquisition technology, and more specifically to an electronic device and a mobile platform.
Background
In order to make the functions of electronic devices more diverse, a depth image acquisition apparatus may be provided on an electronic device to obtain a depth image of a target subject. However, current integrated phase-shift ranging can only obtain a depth image in one direction or within one angular range, so the depth information obtained is limited.
Summary of the invention
The embodiments of this application provide an electronic device and a mobile platform.
The electronic device of the embodiments of this application includes a body and multiple time-of-flight assemblies arranged on the body. The time-of-flight assemblies are located at multiple different orientations of the body, and each includes an optical transmitter and an optical receiver. The optical transmitter is configured to emit laser pulses outward from the body; the optical receiver is configured to receive the laser pulses emitted by the corresponding optical transmitter and reflected by a target subject. The optical transmitters in the time-of-flight assemblies at adjacent orientations emit laser pulses in a time-division manner, and the optical receivers in those assemblies expose at different times, so as to obtain a panoramic depth image.
The mobile platform of the embodiments of this application includes a body and multiple time-of-flight assemblies arranged on the body. The time-of-flight assemblies are located at multiple different orientations of the body, and each includes an optical transmitter and an optical receiver. The optical transmitter is configured to emit laser pulses outward from the body; the optical receiver is configured to receive the laser pulses emitted by the corresponding optical transmitter and reflected by a target subject. The optical transmitters in the time-of-flight assemblies at adjacent orientations emit laser pulses in a time-division manner, and the optical receivers in those assemblies expose at different times, so as to obtain a panoramic depth image.
In the electronic device and mobile platform of the embodiments of this application, the optical transmitters at adjacent orientations of the body emit laser pulses in a time-division manner and the optical receivers expose at different times to obtain a panoramic depth image, so relatively comprehensive depth information can be obtained in a single pass.
Additional aspects and advantages of the embodiments of this application will be set forth in part in the following description, will in part become apparent from that description, or may be learned through practice of the embodiments.
Brief description of the drawings
The above and/or additional aspects and advantages of this application will become apparent and readily understood from the following description of the embodiments taken in conjunction with the accompanying drawings, in which:
Fig. 1 is a structural schematic diagram of an electronic device according to certain embodiments of this application;
Fig. 2 is a block diagram of an electronic device according to certain embodiments of this application;
Fig. 3 is a timing diagram of the times at which multiple optical transmitters emit laser pulses in a time-division manner and multiple optical receivers expose at different times, according to certain embodiments of this application;
Fig. 4 (a) and Fig. 4 (b) are timing diagrams of the times at which multiple optical transmitters emit laser pulses in a time-division manner and multiple optical receivers expose at different times, according to certain embodiments of this application;
Fig. 5 (a) and Fig. 5 (b) are timing diagrams of the times at which multiple optical transmitters emit laser pulses in a time-division manner and multiple optical receivers expose at different times, according to certain embodiments of this application;
Fig. 6 (a) to Fig. 6 (c) are timing diagrams of the times at which multiple optical transmitters emit laser pulses in a time-division manner and multiple optical receivers expose at different times, according to certain embodiments of this application;
Fig. 7 is a timing diagram of the times at which the optical transmitters at adjacent orientations emit laser pulses in a time-division manner and the optical receivers at adjacent orientations expose at different times, according to certain embodiments of this application;
Fig. 8 is a block diagram of an electronic device according to certain embodiments of this application;
Fig. 9 is a schematic diagram of an application scenario of an electronic device according to certain embodiments of this application;
Fig. 10 is a schematic diagram of the coordinate systems used for stitching initial depth images according to certain embodiments of this application;
Fig. 11 to Fig. 15 are schematic diagrams of application scenarios of an electronic device according to certain embodiments of this application;
Fig. 16 to Fig. 19 are structural schematic diagrams of a mobile platform according to certain embodiments of this application.
Specific embodiments
The embodiments of this application are described further below in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numerals denote the same or similar elements, or elements having the same or similar functions. The embodiments described below with reference to the drawings are exemplary, are intended only to explain the embodiments of this application, and should not be understood as limiting this application.
Referring to Fig. 1 and Fig. 2, the electronic device 100 of the embodiments of this application includes a body 10, time-of-flight assemblies 20, a camera assembly 30, a microprocessor 40, and an application processor 50.
The body 10 has multiple different orientations. As shown in Fig. 1, the body 10 may have four different orientations, in clockwise order: a first orientation, a second orientation, a third orientation, and a fourth orientation, where the first orientation is opposite the third orientation and the second orientation is opposite the fourth orientation. The first orientation corresponds to the right side of the body 10, the second orientation to the underside of the body 10, the third orientation to the left side of the body 10, and the fourth orientation to the top of the body 10.
The time-of-flight assemblies 20 are arranged on the body 10. There may be multiple time-of-flight assemblies 20, located at the multiple different orientations of the body 10. Specifically, there may be four time-of-flight assemblies 20: a time-of-flight assembly 20a, a time-of-flight assembly 20b, a time-of-flight assembly 20c, and a time-of-flight assembly 20d. The time-of-flight assembly 20a is arranged at the first orientation, the time-of-flight assembly 20b at the second orientation, the time-of-flight assembly 20c at the third orientation, and the time-of-flight assembly 20d at the fourth orientation. Of course, the number of time-of-flight assemblies 20 may also be eight (or any other number greater than two, in particular any number greater than four), with two (or some other number of) time-of-flight assemblies 20 arranged at each of the first, second, third, and fourth orientations. The embodiments of this application are described with four time-of-flight assemblies 20 as an example. It will be understood that four time-of-flight assemblies 20 are sufficient to obtain a panoramic depth image (a panoramic depth image here meaning one whose field of view is greater than or equal to 180 degrees, for example 180 degrees, 240 degrees, 360 degrees, 480 degrees, or 720 degrees), while helping to save manufacturing cost and to reduce the volume and power consumption of the electronic device 100. The electronic device 100 of this embodiment may be a portable electronic device provided with multiple time-of-flight assemblies 20, such as a mobile phone, a tablet computer, or a laptop; in that case the body 10 may be a mobile phone body, a tablet computer body, a laptop body, or the like.
Each time-of-flight assembly 20 includes an optical transmitter 22 and an optical receiver 24. The optical transmitter 22 emits laser pulses outward from the body 10, and the optical receiver 24 receives the laser pulses emitted by the corresponding optical transmitter 22 and reflected by a target subject. Specifically, the time-of-flight assembly 20a includes an optical transmitter 22a and an optical receiver 24a, the time-of-flight assembly 20b includes an optical transmitter 22b and an optical receiver 24b, the time-of-flight assembly 20c includes an optical transmitter 22c and an optical receiver 24c, and the time-of-flight assembly 20d includes an optical transmitter 22d and an optical receiver 24d. The optical transmitters 22a, 22b, 22c, and 22d emit laser pulses toward the first, second, third, and fourth orientations outside the body 10, respectively. The optical receivers 24a, 24b, 24c, and 24d respectively receive the laser pulses emitted by the optical transmitter 22a and reflected by a target subject at the first orientation, the laser pulses emitted by the optical transmitter 22b and reflected by a target subject at the second orientation, the laser pulses emitted by the optical transmitter 22c and reflected by a target subject at the third orientation, and the laser pulses emitted by the optical transmitter 22d and reflected by a target subject at the fourth orientation, so that all the different regions outside the body 10 are covered. Compared with existing approaches, which must rotate through 360 degrees to obtain comprehensive depth information, the electronic device 100 of this embodiment can obtain comprehensive depth information in a single pass without rotating; it is simple to operate and responds quickly.
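The patent does not spell out the depth calculation itself, but a pulsed time-of-flight assembly of this kind conventionally converts the measured round-trip time of each laser pulse into depth. A minimal sketch of that standard relation, with illustrative values:

```python
# Standard pulsed time-of-flight relation (assumed here, not quoted from
# the patent): the pulse travels to the target and back, so
# depth = c * t_round_trip / 2.

C = 299_792_458.0  # speed of light in m/s

def tof_depth_m(round_trip_s: float) -> float:
    """Depth in meters for a measured round-trip time in seconds."""
    return C * round_trip_s / 2.0

# A pulse returning after ~6.67 ns corresponds to a target about 1 m away.
d = tof_depth_m(6.67e-9)
```

Each optical receiver 24 would apply this conversion per pixel to produce the initial depth image for its orientation.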
The field of view of each optical transmitter 22 and each optical receiver 24 is any value from 80 degrees to 100 degrees. The following discussion uses the field of view of the optical receiver 24 as an example; the field of view of each optical transmitter 22 may be identical or approximately identical to that of the corresponding optical receiver 24 and is not described separately.
In one embodiment, the fields of view of the optical receivers 24a, 24b, 24c, and 24d are each 80 degrees. When the field of view of an optical receiver 24 does not exceed 80 degrees, lens distortion is small, the initial depth images obtained are of good quality, the resulting panoramic depth image is likewise of good quality, and accurate depth information can be obtained.
In one embodiment, the fields of view of the optical receivers 24a, 24b, 24c, and 24d sum to 360 degrees. Specifically, each of the four fields of view may be 90 degrees, with no overlap between them, so that a 360-degree or approximately 360-degree panoramic depth image is obtained. Alternatively, the field of view of the optical receiver 24a may be 80 degrees, that of the optical receiver 24b 100 degrees, that of the optical receiver 24c 80 degrees, and that of the optical receiver 24d 100 degrees, and so on; the four optical receivers 24 complement one another in angle to obtain a 360-degree or approximately 360-degree panoramic depth image.
In one embodiment, the fields of view of the optical receivers 24a, 24b, 24c, and 24d sum to more than 360 degrees, and the fields of view of at least two of the four optical receivers 24 overlap. Specifically, each of the four fields of view may be 100 degrees, so that every pair of adjacent fields of view overlaps. When obtaining the panoramic depth image, the overlapping edge regions of the four initial depth images can first be identified, and the four initial depth images can then be stitched into a 360-degree panoramic depth image. Because every pair of fields of view overlaps, the resulting panoramic depth image is guaranteed to cover the full 360 degrees of depth information around the body 10.
Of course, the specific field-of-view values of the optical receivers 24 (and optical transmitters 22) are not limited to the examples above. Those skilled in the art may set the field of view of each optical receiver 24 (and optical transmitter 22) to any value from 80 degrees to 100 degrees as needed, for example 80, 82, 84, 86, 90, 92, 94, 96, 98, or 100 degrees, or any value in between, for the optical receiver 24, and likewise 80, 82, 84, 86, 90, 92, 94, 96, 98, or 100 degrees, or any value in between, for the optical transmitter 22; no limitation is imposed here.
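The coverage rule running through these embodiments can be summarized as follows (my reading of the text, with illustrative function names): the four fields of view can cover a full panorama when they sum to at least 360 degrees, and any excess over 360 degrees must be absorbed as overlap between adjacent views.

```python
# Sketch of the field-of-view coverage rule described above. The function
# name and return convention are illustrative, not from the patent.

def panorama_coverage(fovs_deg):
    """Return (covers_360, total_overlap_deg) for a list of fields of view."""
    total = sum(fovs_deg)
    return total >= 360.0, max(0.0, total - 360.0)

# 4 x 90 degrees: exact coverage, no overlap needed for stitching.
# 4 x 100 degrees: coverage with 40 degrees of overlap to identify
# during edge matching of the initial depth images.
```

This is why the 4 x 80-degree embodiment is described as yielding good per-image quality rather than a guaranteed 360-degree panorama: its total is only 320 degrees.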
Continuing with Fig. 1 and Fig. 2: in general, the laser pulses emitted by the optical transmitters 22 at adjacent orientations easily interfere with one another, while the laser pulses emitted by the optical transmitters 22 at opposite orientations do not. Therefore, to avoid such interference and improve the accuracy of the depth information obtained, the optical transmitters 22 at adjacent orientations can emit laser pulses in a time-division manner, and the optical receivers 24 at adjacent orientations correspondingly expose at different times, so as to obtain the panoramic depth image. Specifically, the optical transmitter 22a at the first orientation and the optical transmitter 22b at the second orientation emit laser pulses in a time-division manner, as do the optical transmitter 22a and the optical transmitter 22d at the fourth orientation, the optical transmitter 22c at the third orientation and the optical transmitter 22b, and the optical transmitter 22c and the optical transmitter 22d. The optical transmitter 22a at the first orientation and the optical transmitter 22c at the third orientation may emit laser pulses simultaneously or in a time-division manner; likewise, the optical transmitter 22b at the second orientation and the optical transmitter 22d at the fourth orientation may emit simultaneously or in a time-division manner; no limitation is imposed here. Similarly, the optical receiver 24a at the first orientation and the optical receiver 24b at the second orientation expose at different times, as do the optical receiver 24a and the optical receiver 24d at the fourth orientation, the optical receiver 24c at the third orientation and the optical receiver 24b, and the optical receiver 24c and the optical receiver 24d. The optical receiver 24a at the first orientation and the optical receiver 24c at the third orientation may expose simultaneously or at different times, and likewise the optical receiver 24b at the second orientation and the optical receiver 24d at the fourth orientation; no limitation is imposed here.
Preferably, the optical transmitters 22 in all of the time-of-flight assemblies 20 emit laser pulses in a time-division manner, and correspondingly the optical receivers 24 in all of the time-of-flight assemblies 20 expose at different times, so as to obtain the panoramic depth image. When the optical receiver 24 in any one time-of-flight assembly 20 is exposing, the optical transmitters 22 in all other time-of-flight assemblies 20 are switched off. Each optical receiver 24 then receives only the laser pulses emitted by its corresponding optical transmitter 22 and none emitted by the other optical transmitters 22, which avoids the interference problem described above and guarantees the accuracy of the received laser pulses.
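The adjacency constraint above reduces to a simple rule when the four orientations are numbered 0 to 3 clockwise: transmitters at adjacent orientations must never fire together, while transmitters at opposite orientations (indices differing by 2) may. A minimal sketch, with the numbering and function name being my own rather than the patent's:

```python
# Which pairs of the four transmitters may emit simultaneously?
# With orientations numbered 0..3 clockwise (22a=0, 22b=1, 22c=2, 22d=3),
# only opposite orientations (index difference of 2 mod 4) are allowed.

def may_fire_together(orient_a: int, orient_b: int) -> bool:
    """True if transmitters at these two orientations may emit at once."""
    return (orient_a - orient_b) % 4 == 2  # opposite orientations only

# 22a (0) and 22c (2) may share an emission slot; 22a (0) and 22b (1)
# are adjacent and must be time-multiplexed.
```

In the preferred embodiment the device goes further and time-multiplexes all four transmitters, which removes even the opposite-pair case from the schedule.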
Specifically, referring to Fig. 3 and Fig. 4, in one embodiment the optical transmitters 22 in the multiple time-of-flight assemblies 20 switch on in turn and emit laser pulses back to back, and the exposure time of the optical receiver 24 in each time-of-flight assembly 20 falls within the time range in which its optical transmitter 22 is emitting. The optical transmitters 22a, 22b, 22c, and 22d emit laser pulses in a time-division manner: at the moment the optical transmitter 22a stops emitting, the optical transmitter 22b immediately begins emitting; at the moment the optical transmitter 22b stops, the optical transmitter 22c immediately begins; at the moment the optical transmitter 22c stops, the optical transmitter 22d immediately begins; and at the moment the optical transmitter 22d stops, the optical transmitter 22a immediately begins again. The emission times of the optical transmitters 22a, 22b, 22c, and 22d together form one alternating cycle T. The optical receivers 24a, 24b, 24c, and 24d may then expose in either of the following two ways:
(1) The optical receivers 24a, 24b, 24c, and 24d switch on in turn and expose back to back. Specifically, the exposure time of each of the four optical receivers 24 coincides with the emission time of its corresponding optical transmitter 22. As shown in Fig. 3, the optical receivers 24a, 24b, 24c, and 24d expose one after another. The exposure start and end times of the optical receiver 24a coincide with the start and end times of the laser pulse emitted by the optical transmitter 22a in the current alternating cycle T; the exposure start and end times of the optical receiver 24b coincide with those of the laser pulse emitted by the optical transmitter 22b; the exposure start and end times of the optical receiver 24c coincide with those of the laser pulse emitted by the optical transmitter 22c; and the exposure start and end times of the optical receiver 24d coincide with those of the laser pulse emitted by the optical transmitter 22d. As a result, the optical receiver 24a receives only the laser pulses emitted by the optical transmitter 22a and none from the optical transmitters 22b, 22c, or 22d; the optical receiver 24b receives only those from the optical transmitter 22b; the optical receiver 24c only those from the optical transmitter 22c; and the optical receiver 24d only those from the optical transmitter 22d. In this mode, in which the optical receivers 24a, 24b, 24c, and 24d switch on in turn and expose back to back, the optical receiver 24a is controlled synchronously with the optical transmitter 22a, the optical receiver 24b with the optical transmitter 22b, the optical receiver 24c with the optical transmitter 22c, and the optical receiver 24d with the optical transmitter 22d, so the control logic is relatively simple.
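Exposure mode (1) above amounts to dividing each alternating cycle T into four back-to-back slots, with each receiver's exposure window equal to its transmitter's emission window. A minimal sketch under that reading (slot lengths and names are illustrative; the patent does not require the four emission times to be equal):

```python
# Sketch of exposure mode (1): n back-to-back slots per alternating
# cycle T; receiver i exposes exactly while transmitter i emits.

def back_to_back_schedule(cycle_t: float, n: int = 4):
    """Return [(start, end), ...] emission/exposure windows for n slots."""
    slot = cycle_t / n
    return [(i * slot, (i + 1) * slot) for i in range(n)]

# For T = 40 ms: 22a/24a own 0-10 ms, 22b/24b own 10-20 ms, and so on.
# At any instant exactly one transmitter-receiver pair is active, so no
# receiver can see another transmitter's pulse.
```

The back-to-back layout is what gives this mode its high frame rate: there is no idle time anywhere in the cycle.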
(2) As shown in Fig. 4, the optical receivers 24a, 24b, 24c, and 24d switch on in turn and expose with a predetermined interval between exposures, where the exposure time of at least one optical receiver 24 is shorter than the emission time of its corresponding optical transmitter 22. Specifically, as shown in Fig. 4 (a), in one example the optical receivers 24a, 24b, 24c, and 24d expose one after another. The exposure time of the optical receiver 24a is shorter than the emission time of the optical transmitter 22a; that of the optical receiver 24b is equal to the emission time of the optical transmitter 22b; that of the optical receiver 24c is shorter than the emission time of the optical transmitter 22c; and that of the optical receiver 24d is equal to the emission time of the optical transmitter 22d. The exposure of the optical receiver 24a starts after, and ends before, the laser pulse emitted by the optical transmitter 22a in the current alternating cycle T; the exposure of the optical receiver 24b starts and ends together with the laser pulse emitted by the optical transmitter 22b; the exposure of the optical receiver 24c starts after, and ends before, the laser pulse emitted by the optical transmitter 22c; and the exposure of the optical receiver 24d starts and ends together with the laser pulse emitted by the optical transmitter 22d. A predetermined interval Δt1 separates the exposure end of the optical receiver 24a from the exposure start of the optical receiver 24b in the current alternating cycle T; Δt2 separates the exposure end of the optical receiver 24b from the exposure start of the optical receiver 24c; Δt3 separates the exposure end of the optical receiver 24c from the exposure start of the optical receiver 24d; and Δt4 separates the exposure end of the optical receiver 24d from the exposure start of the optical receiver 24a in the next alternating cycle T. Δt1, Δt2, Δt3, and Δt4 may all be equal, all differ, or be partly equal and partly different. The optical receiver 24a receives only the laser pulses emitted by the optical transmitter 22a, the optical receiver 24b only those from the optical transmitter 22b, the optical receiver 24c only those from the optical transmitter 22c, and the optical receiver 24d only those from the optical transmitter 22d. As shown in Fig. 4 (b), in another example the optical receivers 24a, 24b, 24c, and 24d again expose one after another, and the exposure time of every optical receiver 24 is shorter than the emission time of its corresponding optical transmitter 22: each receiver's exposure starts after, and ends before, the laser pulse emitted by its transmitter in the current alternating cycle T, with the predetermined intervals Δt1 to Δt4 between successive exposures as before, which may again all be equal, all differ, or be partly equal and partly different. Each optical receiver 24 still receives only the laser pulses emitted by its corresponding optical transmitter 22. In this mode, in which the optical receivers 24a, 24b, 24c, and 24d switch on in turn with a predetermined interval between exposures, the exposure time of at least one optical receiver 24 is shorter than the emission time of its corresponding optical transmitter 22, which helps reduce the power consumption of the electronic device 100.
In the control mode in which the multiple optical transmitting sets 22 of the multiple flight time components 20 emit laser pulses one after another without interruption, the frame rate at which the flight time components 20 acquire initial depth images is higher, which suits scenes with higher frame-rate requirements for initial depth image acquisition.
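The timing relationship described above can be illustrated with a short sketch. The following code is not part of the patent; the durations are made-up values, and it simply shows that in the uninterrupted mode one alternate cycle T is the sum of the four emission times, so the acquisition frame rate is 1/T.

```python
# Illustrative sketch (not from the patent): back-to-back emission windows for
# optical transmitting sets 22a-22d within one alternate cycle T.
# Durations are assumed values in microseconds.

def back_to_back_schedule(emit_times):
    """Return (start, end) windows for each emitter and the cycle length T."""
    windows = []
    t = 0.0
    for dur in emit_times:
        windows.append((t, t + dur))
        t += dur
    return windows, t  # t is the cycle length T

emit_times = [100.0, 100.0, 100.0, 100.0]  # 22a, 22b, 22c, 22d
windows, T = back_to_back_schedule(emit_times)
frame_rate = 1e6 / T  # depth-image sets acquired per second
```

Each receiver would then be exposed only inside (or within) its emitter's window, which is what restricts it to receiving its own transmitter's pulses.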
Referring to Fig. 5 and Fig. 6, in another embodiment, the multiple optical transmitting sets 22 of the multiple flight time components 20 emit laser pulses one after another separated by predetermined times; that is, optical transmitting set 22a, optical transmitting set 22b, optical transmitting set 22c and optical transmitting set 22d emit laser pulses in alternation. The cut-off time at which optical transmitting set 22a emits laser pulses and the start time at which optical transmitting set 22b of the current alternate cycle T emits laser pulses are separated by a predetermined time Δt5; the cut-off time at which optical transmitting set 22b emits laser pulses and the start time at which optical transmitting set 22c of the current alternate cycle T emits laser pulses are separated by a predetermined time Δt6; the cut-off time at which optical transmitting set 22c emits laser pulses and the start time at which optical transmitting set 22d of the current alternate cycle T emits laser pulses are separated by a predetermined time Δt7; and the cut-off time at which optical transmitting set 22d emits laser pulses and the start time at which optical transmitting set 22a of the next alternate cycle T emits laser pulses are separated by a predetermined time Δt8. Δt5, Δt6, Δt7 and Δt8 may all be equal, may all differ, or may be partly equal and partly different. The times for which optical transmitting sets 22a, 22b, 22c and 22d emit laser pulses, together with the predetermined times Δt5, Δt6, Δt7 and Δt8, constitute one alternate cycle T. At this point, the exposure modes of optical receiver 24a, optical receiver 24b, optical receiver 24c and optical receiver 24d may include the following two:
(1) Optical receiver 24a, optical receiver 24b, optical receiver 24c and optical receiver 24d expose one after another without interruption. Specifically, as shown in Fig. 5(a), in one example, the exposure start time of optical receiver 24a coincides with the start time at which optical transmitting set 22a of the current alternate cycle T emits laser pulses, and its exposure cut-off time coincides with the cut-off time at which optical transmitting set 22a emits laser pulses; the exposure start time of optical receiver 24b coincides with the cut-off time at which optical transmitting set 22a of the current alternate cycle T emits laser pulses, and its exposure cut-off time coincides with the start time at which optical transmitting set 22c emits laser pulses; the exposure start time of optical receiver 24c coincides with the start time at which optical transmitting set 22c of the current alternate cycle T emits laser pulses, and its exposure cut-off time coincides with the cut-off time at which optical transmitting set 22c emits laser pulses; the exposure start time of optical receiver 24d coincides with the cut-off time at which optical transmitting set 22c of the current alternate cycle T emits laser pulses, and its exposure cut-off time coincides with the start time at which optical transmitting set 22a of the next alternate cycle T emits laser pulses. Optical receiver 24a can only receive the laser pulses emitted by optical transmitting set 22a, optical receiver 24b only those emitted by optical transmitting set 22b, optical receiver 24c only those emitted by optical transmitting set 22c, and optical receiver 24d only those emitted by optical transmitting set 22d. As shown in Fig. 5(b), in another example, the exposure start time of optical receiver 24a coincides with the start time at which optical transmitting set 22a of the current alternate cycle T emits laser pulses, and its exposure cut-off time coincides with the start time at which optical transmitting set 22b emits laser pulses; the exposure start time of optical receiver 24b coincides with the start time at which optical transmitting set 22b of the current alternate cycle T emits laser pulses, and its exposure cut-off time coincides with the start time at which optical transmitting set 22c emits laser pulses; the exposure start time of optical receiver 24c coincides with the start time at which optical transmitting set 22c of the current alternate cycle T emits laser pulses, and its exposure cut-off time coincides with the start time at which optical transmitting set 22d emits laser pulses; the exposure start time of optical receiver 24d coincides with the start time at which optical transmitting set 22d of the current alternate cycle T emits laser pulses, and its exposure cut-off time coincides with the start time at which optical transmitting set 22a of the next alternate cycle T emits laser pulses. Again, each of optical receivers 24a, 24b, 24c and 24d can only receive the laser pulses emitted by the corresponding optical transmitting set 22a, 22b, 22c and 22d.
(2) Optical receiver 24a, optical receiver 24b, optical receiver 24c and optical receiver 24d expose one after another separated by predetermined times. Specifically, as shown in Fig. 6(a), in one example, the exposure start time and exposure cut-off time of optical receiver 24a coincide respectively with the start time and cut-off time at which optical transmitting set 22a of the current alternate cycle T emits laser pulses; the exposure start time and exposure cut-off time of optical receiver 24b coincide respectively with the start time and cut-off time at which optical transmitting set 22b of the current alternate cycle T emits laser pulses; the exposure start time and exposure cut-off time of optical receiver 24c coincide respectively with the start time and cut-off time at which optical transmitting set 22c of the current alternate cycle T emits laser pulses; and the exposure start time and exposure cut-off time of optical receiver 24d coincide respectively with the start time and cut-off time at which optical transmitting set 22d of the current alternate cycle T emits laser pulses. The exposure cut-off time of optical receiver 24a and the exposure start time of optical receiver 24b in the current alternate cycle T are separated by the predetermined time Δt5; the exposure cut-off time of optical receiver 24b and the exposure start time of optical receiver 24c in the current alternate cycle T are separated by the predetermined time Δt6; the exposure cut-off time of optical receiver 24c and the exposure start time of optical receiver 24d in the current alternate cycle T are separated by the predetermined time Δt7; and the exposure cut-off time of optical receiver 24d and the exposure start time of optical receiver 24a in the next alternate cycle T are separated by the predetermined time Δt8. Δt5, Δt6, Δt7 and Δt8 may all be equal, may all differ, or may be partly equal and partly different. Each of optical receivers 24a, 24b, 24c and 24d can only receive the laser pulses emitted by the corresponding optical transmitting set 22a, 22b, 22c and 22d.
As shown in Fig. 6(b), in another example, the exposure start time and exposure cut-off time of optical receiver 24a coincide respectively with the start time and cut-off time at which optical transmitting set 22a of the current alternate cycle T emits laser pulses; the exposure start time of optical receiver 24b is earlier than the start time at which optical transmitting set 22b of the current alternate cycle T emits laser pulses, and its exposure cut-off time is earlier than the start time at which optical transmitting set 22c of the current alternate cycle T emits laser pulses; the exposure start time and exposure cut-off time of optical receiver 24c coincide respectively with the start time and cut-off time at which optical transmitting set 22c of the current alternate cycle T emits laser pulses; the exposure start time of optical receiver 24d is earlier than the start time at which optical transmitting set 22d of the current alternate cycle T emits laser pulses, and its exposure cut-off time is earlier than the start time at which optical transmitting set 22a of the next alternate cycle T emits laser pulses. The exposure cut-off time of optical receiver 24a and the exposure start time of optical receiver 24b in the current alternate cycle T are separated by a predetermined time Δt9; the exposure cut-off time of optical receiver 24b and the exposure start time of optical receiver 24c in the current alternate cycle T are separated by a predetermined time Δt10; the exposure cut-off time of optical receiver 24c and the exposure start time of optical receiver 24d in the current alternate cycle T are separated by a predetermined time Δt11; and the exposure cut-off time of optical receiver 24d and the exposure start time of optical receiver 24a in the next alternate cycle T are separated by a predetermined time Δt12. Δt9, Δt10, Δt11 and Δt12 may all be equal, may all differ, or may be partly equal and partly different. Each of optical receivers 24a, 24b, 24c and 24d can only receive the laser pulses emitted by the corresponding optical transmitting set 22a, 22b, 22c and 22d.
As shown in Fig. 6(c), in yet another example, the exposure start time of optical receiver 24a is later than the cut-off time at which optical transmitting set 22d of the previous alternate cycle T emits laser pulses, and its exposure cut-off time is earlier than the start time at which optical transmitting set 22b of the current alternate cycle T emits laser pulses; equivalently, the exposure start time of optical receiver 24a is later than the exposure cut-off time of optical receiver 24d of the previous alternate cycle T, and its exposure cut-off time is earlier than the exposure start time of optical receiver 24b of the current alternate cycle T. The exposure start time of optical receiver 24b is later than the cut-off time at which optical transmitting set 22a of the current alternate cycle T emits laser pulses, and its exposure cut-off time is earlier than the start time at which optical transmitting set 22c of the current alternate cycle T emits laser pulses; equivalently, the exposure start time of optical receiver 24b is later than the exposure cut-off time of optical receiver 24a of the current alternate cycle T, and its exposure cut-off time is earlier than the exposure start time of optical receiver 24c of the current alternate cycle T. The exposure start time of optical receiver 24c is later than the cut-off time at which optical transmitting set 22b of the current alternate cycle T emits laser pulses, and its exposure cut-off time is earlier than the start time at which optical transmitting set 22d of the current alternate cycle T emits laser pulses; equivalently, the exposure start time of optical receiver 24c is later than the exposure cut-off time of optical receiver 24b of the current alternate cycle T, and its exposure cut-off time is earlier than the exposure start time of optical receiver 24d of the current alternate cycle T. The exposure start time of optical receiver 24d is later than the cut-off time at which optical transmitting set 22c of the current alternate cycle T emits laser pulses, and its exposure cut-off time is earlier than the start time at which optical transmitting set 22a of the next alternate cycle T emits laser pulses; equivalently, the exposure start time of optical receiver 24d is later than the exposure cut-off time of optical receiver 24c of the current alternate cycle T, and its exposure cut-off time is earlier than the exposure start time of optical receiver 24a of the next alternate cycle T. The exposure cut-off time of optical receiver 24a and the exposure start time of optical receiver 24b in the current alternate cycle T are separated by the predetermined time Δt9; the exposure cut-off time of optical receiver 24b and the exposure start time of optical receiver 24c in the current alternate cycle T are separated by the predetermined time Δt10; the exposure cut-off time of optical receiver 24c and the exposure start time of optical receiver 24d in the current alternate cycle T are separated by the predetermined time Δt11; and the exposure cut-off time of optical receiver 24d and the exposure start time of optical receiver 24a in the next alternate cycle T are separated by the predetermined time Δt12. Δt9, Δt10, Δt11 and Δt12 may all be equal, may all differ, or may be partly equal and partly different.
In the control mode in which the multiple optical transmitting sets 22 of the multiple flight time components 20 emit laser pulses one after another separated by predetermined times, the frame rate at which the flight time components 20 acquire initial depth images is lower, which suits scenes with lower frame-rate requirements for initial depth image acquisition while also helping reduce the power consumption of the electronic equipment 100.
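The frame-rate trade-off between the two emission modes can be made concrete with a small sketch. This is illustrative only and not part of the patent; all durations and gap values Δt5..Δt8 are assumed.

```python
# Illustrative sketch (assumed values): in the spaced mode each emission window
# is followed by a predetermined gap, so the alternate cycle T grows and the
# initial-depth-image frame rate drops relative to the uninterrupted mode.

def spaced_cycle(emit_times, gaps):
    """Cycle length T when each emission window is followed by its gap."""
    return sum(emit_times) + sum(gaps)

emit_times = [100.0] * 4          # 22a..22d emission durations (microseconds)
gaps = [50.0, 50.0, 50.0, 50.0]   # Δt5, Δt6, Δt7, Δt8 (microseconds)

T_nonstop = sum(emit_times)                    # uninterrupted mode
T_spaced = spaced_cycle(emit_times, gaps)      # spaced mode
fps_nonstop = 1e6 / T_nonstop
fps_spaced = 1e6 / T_spaced                    # lower frame rate, lower power
```

The idle gaps are also the source of the power saving: neither the emitters nor the receivers are active during Δt5..Δt8.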
In addition, as previously mentioned, when the optical transmitting sets 22 of adjacent orientations emit laser pulses in time-division fashion and the optical receivers 24 of adjacent orientations expose at different times, the optical transmitting set 22a of the first orientation and the optical transmitting set 22c of the third orientation can emit laser pulses simultaneously, and the optical receiver 24a of the first orientation can expose simultaneously with the optical receiver 24c of the third orientation; likewise, the optical transmitting set 22b of the second orientation and the optical transmitting set 22d of the fourth orientation can emit laser pulses simultaneously, and the optical receiver 24b of the second orientation can expose simultaneously with the optical receiver 24d of the fourth orientation. In this case, referring to Fig. 7, the time for which optical transmitting set 22a emits laser pulses (which is also the time for which optical transmitting set 22c emits laser pulses) and the time for which optical transmitting set 22b emits laser pulses (which is also the time for which optical transmitting set 22d emits laser pulses) together constitute one alternate cycle T (an alternate cycle T may also include each of the aforementioned predetermined times Δt). Optical transmitting sets 22a, 22b, 22c and 22d can thus be controlled as if they were two optical transmitting sets 22, and optical receivers 24a, 24b, 24c and 24d can likewise be controlled as two optical receivers 24; for the control modes, refer to Fig. 3 to Fig. 6 and their corresponding descriptions, which are not repeated here.
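The reduction from four control channels to two can be sketched as follows. This is an illustrative grouping helper, not part of the patent: it simply pairs each device with the one two positions away (i.e., the opposite orientation), which may share its time slot because only adjacent orientations must be separated in time.

```python
# Illustrative sketch (assumption: only adjacent orientations interfere, so
# opposite orientations can share a time slot). Group the four emitter labels
# into two simultaneously-driven control groups: (22a, 22c) and (22b, 22d).

def opposite_groups(devices):
    """Pair each device with the one diametrically opposite it."""
    n = len(devices)
    return [[devices[i], devices[i + n // 2]] for i in range(n // 2)]

orientations = ["22a", "22b", "22c", "22d"]
groups = opposite_groups(orientations)  # two groups drive two time slots
```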
Referring to Fig. 1 and Fig. 2, the CCD camera assemblies 30 are arranged on the ontology 10. The quantity of CCD camera assemblies 30 may be multiple, with each CCD camera assembly 30 corresponding to one flight time component 20. For example, when the quantity of flight time components 20 is four, the quantity of CCD camera assemblies 30 is also four, and the four CCD camera assemblies 30 are separately arranged at the first orientation, the second orientation, the third orientation and the fourth orientation.
The multiple CCD camera assemblies 30 are connected with the application processor 50. Each CCD camera assembly 30 is used to acquire a scene image of the target subject and output it to the application processor 50. In the present embodiment, the four CCD camera assemblies 30 are respectively used to acquire the scene image of the target subject of the first orientation, the scene image of the target subject of the second orientation, the scene image of the target subject of the third orientation and the scene image of the target subject of the fourth orientation, and to output them respectively to the application processor 50. It can be appreciated that the field angle of each CCD camera assembly 30 is identical or approximately identical to that of the optical receiver 24 of the corresponding flight time component 20, so that each scene image can be better matched with the corresponding initial depth image.
The CCD camera assembly 30 may be a visible image capturing head 32 or an infrared pick-up head 34. When the CCD camera assembly 30 is a visible image capturing head 32, the scene image is a visible light image; when the CCD camera assembly 30 is an infrared pick-up head 34, the scene image is an infrared light image.
Referring to Fig. 2, the microprocessor 40 may be a processing chip. The quantity of microprocessors 40 may be multiple, with each microprocessor 40 corresponding to one flight time component 20. For example, in the present embodiment, when the quantity of flight time components 20 is four, the quantity of microprocessors 40 is also four. Each microprocessor 40 is connected with both the optical transmitting set 22 and the optical receiver 24 of the corresponding flight time component 20. Each microprocessor 40 can drive the corresponding optical transmitting set 22 to emit laser through a driving circuit, and through the control of the multiple microprocessors 40, the multiple optical transmitting sets 22 can emit laser simultaneously. Each microprocessor 40 is also used to provide to the corresponding optical receiver 24 the clock information for receiving laser pulses so that the optical receiver 24 exposes, and through the control of the multiple microprocessors 40, the multiple optical receivers 24 can expose simultaneously. Each microprocessor 40 is further used to obtain an initial depth image according to the laser pulses emitted by the corresponding optical transmitting set 22 and the laser pulses received by the corresponding optical receiver 24. For example, the four microprocessors 40 respectively obtain initial depth image P1 according to the laser pulses emitted by optical transmitting set 22a and received by optical receiver 24a, obtain initial depth image P2 according to the laser pulses emitted by optical transmitting set 22b and received by optical receiver 24b, obtain initial depth image P3 according to the laser pulses emitted by optical transmitting set 22c and received by optical receiver 24c, and obtain initial depth image P4 according to the laser pulses emitted by optical transmitting set 22d and received by optical receiver 24d (as shown in the upper part of Fig. 9). Each microprocessor 40 can also perform processing such as tiling, distortion correction and self-calibration algorithms on the initial depth images to improve their quality.
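As a simplified illustration of how a microprocessor 40 could derive a depth value from the emitted and received pulses: the patent does not spell out the formula, but the standard direct time-of-flight model converts each pixel's pulse round-trip delay into distance via depth = c·delay/2. The delay values below are assumed for illustration.

```python
# Simplified direct time-of-flight sketch (not the patent's stated method):
# each pixel's round-trip delay between emission by the optical transmitting
# set 22 and reception by the optical receiver 24 maps to a distance.

C = 299_792_458.0  # speed of light in m/s

def depth_from_delay(delay_s):
    """Distance to the target subject: the pulse travels out and back."""
    return C * delay_s / 2.0

# Illustrative 2x2 "initial depth image" built from per-pixel delays (seconds).
delays = [[10e-9, 12e-9], [14e-9, 16e-9]]
depth_image = [[depth_from_delay(d) for d in row] for row in delays]
```

A 10 ns round trip corresponds to roughly 1.5 m, which is the order of distance such components typically measure.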
In another embodiment, as shown in Fig. 8, the quantity of microprocessors 40 may also be one. In this case, the microprocessor 40 is connected simultaneously with the optical transmitting sets 22 and optical receivers 24 of the multiple flight time components 20; specifically, the microprocessor 40 is connected simultaneously with optical transmitting set 22a, optical receiver 24a, optical transmitting set 22b, optical receiver 24b, optical transmitting set 22c, optical receiver 24c, optical transmitting set 22d and optical receiver 24d. The one microprocessor 40 can control multiple different driving circuits in time-division fashion to respectively drive the multiple optical transmitting sets 22 to emit laser pulses, can provide in time-division fashion to the multiple optical receivers 24 the clock information for receiving laser pulses so that the multiple optical receivers 24 expose at different times, and successively obtains multiple initial depth images according to the laser pulses emitted by the multiple optical transmitting sets 22 and received by the multiple optical receivers 24. For example, the microprocessor 40 first obtains initial depth image P1 according to the laser pulses emitted by optical transmitting set 22a and received by optical receiver 24a, then obtains initial depth image P2 according to the laser pulses emitted by optical transmitting set 22b and received by optical receiver 24b, then obtains initial depth image P3 according to the laser pulses emitted by optical transmitting set 22c and received by optical receiver 24c, and finally obtains initial depth image P4 according to the laser pulses emitted by optical transmitting set 22d and received by optical receiver 24d (as shown in the upper part of Fig. 9). Compared with one microprocessor 40, multiple microprocessors 40 process faster and with smaller delay; but compared with multiple microprocessors 40, one microprocessor 40 helps reduce the volume of the electronic equipment 100 and also helps reduce its manufacturing cost.
When there are multiple microprocessors 40, the multiple microprocessors 40 are all connected with the application processor 50 to transmit the initial depth images to the application processor 50. When there is one microprocessor 40, the one microprocessor 40 is connected with the application processor 50 to transmit the initial depth images to the application processor 50. In one embodiment, the microprocessor 40 may be connected with the application processor 50 through a mobile industry processor interface (Mobile Industry Processor Interface, MIPI); specifically, the microprocessor 40 is connected through the mobile industry processor interface with the trusted execution environment (Trusted Execution Environment, TEE) of the application processor 50, and the data (initial depth images) in the microprocessor 40 are transmitted directly into the trusted execution environment, so as to improve the security of the information in the electronic equipment 100. The code and memory region in the trusted execution environment are controlled by an access control unit and cannot be accessed by programs in the untrusted execution environment (Rich Execution Environment, REE); both the trusted execution environment and the untrusted execution environment may be formed in the application processor 50.
The application processor 50 can serve as the system of the electronic equipment 100. The application processor 50 can reset the microprocessor 40, wake (wake) the microprocessor 40, debug (debug) the microprocessor 40, and so on. The application processor 50 can also be connected with multiple electronic components of the electronic equipment 100 and control the multiple electronic components to run in predetermined modes. For example, the application processor 50 connects with the visible image capturing head 32 and the infrared pick-up head 34 to control the visible image capturing head 32 and the infrared pick-up head 34 to shoot visible light images and infrared light images, and processes those visible light images and infrared light images; when the electronic equipment 100 includes a display screen, the application processor 50 can control the display screen to display predetermined pictures; the application processor 50 can also control the antenna of the electronic equipment 100 to send or receive predetermined data, and so on.
Referring to Fig. 9, in one embodiment, the application processor 50 is used to synthesize, according to the field angles of the optical receivers 24, the multiple initial depth images obtained by the multiple microprocessors 40 into one frame of panoramic depth image, or to synthesize, according to the field angles of the optical receivers 24, the multiple initial depth images successively obtained by the one microprocessor 40 into one frame of panoramic depth image.
Specifically, in conjunction with Fig. 1, taking the center of the ontology 10 as the circle center O, a rectangular coordinate system XOY is established with the transverse axis as the X-axis and the longitudinal axis as the Y-axis. In the rectangular coordinate system XOY, the field of view of optical receiver 24a lies between 45 degrees and 315 degrees (measured by clockwise rotation, likewise below), the field of view of optical receiver 24b lies between 315 degrees and 225 degrees, the field of view of optical receiver 24c lies between 225 degrees and 135 degrees, and the field of view of optical receiver 24d lies between 135 degrees and 45 degrees. The application processor 50 then splices initial depth image P1, initial depth image P2, initial depth image P3 and initial depth image P4 in turn, according to the field angles of the four optical receivers 24, into one frame of 360-degree panoramic depth image P1234, for subsequent use of the depth information.
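The sector layout just described can be sketched as a lookup from azimuth to the covering initial depth image. The sector boundaries come from the description above (angles measured clockwise, with receiver 24a spanning 45° through 0° to 315°); the mapping function itself is an illustrative assumption, not the patent's algorithm.

```python
# Illustrative sketch: map a clockwise azimuth in the XOY system to the
# initial depth image whose receiver's 90-degree field of view covers it.
# Sector bounds follow the description; the code itself is an assumption.

def receiver_for_azimuth(theta):
    """Return the covering depth image label for a clockwise azimuth."""
    theta %= 360.0
    if theta >= 315.0 or theta < 45.0:
        return "P1"   # optical receiver 24a: 45..315 through 0
    if theta < 135.0:
        return "P4"   # optical receiver 24d: 135..45
    if theta < 225.0:
        return "P3"   # optical receiver 24c: 225..135
    return "P2"       # optical receiver 24b: 315..225
```

Splicing P1..P4 edge to edge in this angular order yields the full 360-degree panoramic depth image P1234.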
In the initial depth image obtained by the microprocessor 40 according to the laser pulses emitted by the optical transmitting set 22 and received by the optical receiver 24, the depth information of each pixel is the distance between the target subject of the corresponding orientation and the optical receiver 24 of that orientation. That is, in initial depth image P1 the depth information of each pixel is the distance between the target subject of the first orientation and optical receiver 24a; in initial depth image P2 the depth information of each pixel is the distance between the target subject of the second orientation and optical receiver 24b; in initial depth image P3 the depth information of each pixel is the distance between the target subject of the third orientation and optical receiver 24c; and in initial depth image P4 the depth information of each pixel is the distance between the target subject of the fourth orientation and optical receiver 24d. In the process of splicing the multiple initial depth images of the multiple orientations into one frame of 360-degree panoramic depth image, the depth information of each pixel in each initial depth image first has to be converted into unitized depth information, where the unitized depth information indicates the distance of each target subject of each orientation from some base position. After the depth information is converted into unitized depth information, it is convenient for the application processor 50 to splice the initial depth images according to the unitized depth information.
Specifically, a frame of reference is selected. The frame of reference may take the image coordinate system of the optical receiver 24 of some orientation as the frame of reference, or another coordinate system may be selected as the frame of reference. Taking Figure 10 as an example, the coordinate system x_o-y_o-z_o is the frame of reference. The coordinate system x_a-y_a-z_a shown in Fig. 10 is the image coordinate system of optical receiver 24a, the coordinate system x_b-y_b-z_b is the image coordinate system of optical receiver 24b, the coordinate system x_c-y_c-z_c is the image coordinate system of optical receiver 24c, and the coordinate system x_d-y_d-z_d is the image coordinate system of optical receiver 24d. The application processor 50 converts the depth information of each pixel in initial depth image P1 into unitized depth information according to the rotation matrix and translation matrix between coordinate system x_a-y_a-z_a and the frame of reference x_o-y_o-z_o; converts the depth information of each pixel in initial depth image P2 into unitized depth information according to the rotation matrix and translation matrix between coordinate system x_b-y_b-z_b and the frame of reference x_o-y_o-z_o; converts the depth information of each pixel in initial depth image P3 into unitized depth information according to the rotation matrix and translation matrix between coordinate system x_c-y_c-z_c and the frame of reference x_o-y_o-z_o; and converts the depth information of each pixel in initial depth image P4 into unitized depth information according to the rotation matrix and translation matrix between coordinate system x_d-y_d-z_d and the frame of reference x_o-y_o-z_o.
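The rotation-plus-translation step can be sketched as p_o = R·p_a + t. The following pure-Python example uses an assumed toy rotation and translation, not calibration data from the patent, to show how a point in a receiver's image coordinate system x_a-y_a-z_a would be expressed in the frame of reference x_o-y_o-z_o.

```python
# Illustrative sketch: rigid transform from a receiver's image coordinate
# system into the frame of reference. R_ao and t_ao are made-up values; real
# values would come from the device's calibration.

def mat_vec(R, p):
    """3x3 matrix times 3-vector."""
    return [sum(R[i][j] * p[j] for j in range(3)) for i in range(3)]

def to_reference(p, R, t):
    """p_o = R @ p + t."""
    q = mat_vec(R, p)
    return [q[i] + t[i] for i in range(3)]

# Toy calibration: 90-degree rotation about z plus a small translation.
R_ao = [[0.0, -1.0, 0.0],
        [1.0,  0.0, 0.0],
        [0.0,  0.0, 1.0]]
t_ao = [0.1, 0.0, 0.0]

p_a = [1.0, 2.0, 3.0]            # point in x_a-y_a-z_a
p_o = to_reference(p_a, R_ao, t_ao)  # same point in x_o-y_o-z_o
```

Applying the per-receiver (R, t) pair to every pixel of P1..P4 is what yields the unitized depth information under one shared coordinate system.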
After the depth information conversion is completed, the multiple initial depth images all lie under one unified frame of reference, and each pixel of each initial depth image corresponds to a coordinate (x_o, y_o, z_o); the splicing of the initial depth images can then be done by coordinate matching. For example, if the coordinate of some pixel P_a in initial depth image P1 is (x_o1, y_o1, z_o1), and the coordinate of some pixel P_b in initial depth image P2 is also (x_o1, y_o1, z_o1), then since P_a and P_b have the same coordinate value under the current frame of reference, pixel P_a and pixel P_b are actually the same point, and when initial depth image P1 and initial depth image P2 are spliced, pixel P_a needs to coincide with pixel P_b. In this way, the application processor 50 can splice the multiple initial depth images through the matching relationship of coordinates and obtain a 360-degree panoramic depth image.
It should be noted that splicing the initial depth images based on the matching relationship of coordinates requires that the resolution of the initial depth images be greater than a preset resolution. It can be appreciated that if the resolution of the initial depth images is low, the accuracy of the coordinates (x_o, y_o, z_o) will also be relatively low; in this case, matching directly by coordinates may run into the problem that point P_a and point P_b do not actually coincide but differ by an offset, and the value of the offset exceeds the error limit. If the resolution of the images is high, the accuracy of the coordinates (x_o, y_o, z_o) will also be relatively high; in this case, when matching directly by coordinates, even if point P_a and point P_b do not actually coincide and differ by an offset, the value of the offset will be less than the error limit, that is, within the range permitted by the error, and will not unduly affect the splicing of the initial depth images.
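The tolerance test described above can be sketched directly: two unitized-coordinate points are treated as the same point when their offset is within the error limit. The coordinates and the error limit below are assumed illustrative values.

```python
import math

# Illustrative sketch (thresholds assumed): decide whether two points P_a and
# P_b in the unified frame of reference should coincide during splicing, by
# comparing their offset against an error limit.

def same_point(pa, pb, error_limit):
    """True when the offset between the two points is within the error limit."""
    offset = math.dist(pa, pb)
    return offset <= error_limit

pa = (1.000, 2.000, 3.000)
pb = (1.001, 2.000, 3.000)    # 1 mm offset after conversion

high_res_ok = same_point(pa, pb, error_limit=0.005)    # within tolerance
low_res_bad = same_point(pa, (1.05, 2.0, 3.0), 0.005)  # offset exceeds limit
```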
It can be appreciated that subsequent embodiments may use the foregoing approach to splice or synthesize two or more initial depth images, and this is not illustrated one by one again.
The application processor 50 may also synthesize the multiple initial depth images with the corresponding multiple visible-light images into a three-dimensional scene image for display to the user. For example, the multiple visible-light images are visible-light image V1, visible-light image V2, visible-light image V3 and visible-light image V4, respectively. The application processor 50 then synthesizes initial depth image P1 with visible-light image V1, initial depth image P2 with visible-light image V2, initial depth image P3 with visible-light image V3, and initial depth image P4 with visible-light image V4, and stitches the four synthesized images to obtain one frame of a 360-degree three-dimensional scene image. Alternatively, the application processor 50 first stitches initial depth image P1, initial depth image P2, initial depth image P3 and initial depth image P4 into one frame of a 360-degree panoramic depth image, stitches visible-light image V1, visible-light image V2, visible-light image V3 and visible-light image V4 into one frame of a 360-degree panoramic visible-light image, and then synthesizes the panoramic depth image and the panoramic visible-light image into a 360-degree three-dimensional scene image.
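The second ordering above — stitch the depth panorama, stitch the visible-light panorama, then fuse the two — can be sketched as follows. This is a hedged illustration, assuming images are simple row lists laid side by side per orientation; the actual stitching in the patent uses the coordinate-matching relationship, not plain concatenation.

```python
def stitch_horizontally(images):
    """Concatenate same-height per-orientation images (lists of rows)
    side by side into one panoramic image."""
    return [sum((img[r] for img in images), [])
            for r in range(len(images[0]))]

def fuse_rgbd(depth_pano, visible_pano):
    """Attach a depth value to every visible-light pixel, yielding a
    simple RGB-D scene image."""
    return [[(rgb, d) for rgb, d in zip(vrow, drow)]
            for vrow, drow in zip(visible_pano, depth_pano)]
```

The first ordering (fuse per orientation, then stitch) would simply apply `fuse_rgbd` to each orientation's pair before calling `stitch_horizontally`.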
Referring to Figure 11, in one embodiment, the application processor 50 is configured to identify the target subject according to the multiple initial depth images obtained by the multiple microprocessors 40 and the multiple scene images acquired by the multiple camera assemblies 30, or according to the multiple initial depth images successively obtained by one microprocessor 40 and the multiple scene images acquired by the multiple camera assemblies 30.
Specifically, when the scene images are infrared-light images, the multiple infrared-light images may be infrared-light image I1, infrared-light image I2, infrared-light image I3 and infrared-light image I4, respectively. The application processor 50 identifies the target subject of the first orientation according to initial depth image P1 and infrared-light image I1, the target subject of the second orientation according to initial depth image P2 and infrared-light image I2, the target subject of the third orientation according to initial depth image P3 and infrared-light image I3, and the target subject of the fourth orientation according to initial depth image P4 and infrared-light image I4. When the scene images are visible-light images, the multiple visible-light images are visible-light image V1, visible-light image V2, visible-light image V3 and visible-light image V4, respectively. The application processor 50 identifies the target subject of the first orientation according to initial depth image P1 and visible-light image V1, the target subject of the second orientation according to initial depth image P2 and visible-light image V2, the target subject of the third orientation according to initial depth image P3 and visible-light image V3, and the target subject of the fourth orientation according to initial depth image P4 and visible-light image V4.
When the target subject is identified for the purpose of face recognition, the application processor 50 achieves higher face-recognition accuracy by using infrared-light images as the scene images. The process by which the application processor 50 performs face recognition according to an initial depth image and an infrared-light image may be as follows:
First, face detection is performed according to the infrared-light image to determine a target face region. Since the infrared-light image contains detailed scene information, once the infrared-light image is obtained, face detection can be performed on it to detect whether it contains a face. If the infrared-light image contains a face, the target face region where the face is located is extracted from the infrared-light image.
Then, liveness detection is performed on the target face region according to the initial depth image. Since each initial depth image corresponds to an infrared-light image and contains the depth information of the corresponding infrared-light image, the depth information corresponding to the target face region can be obtained from the initial depth image. Further, since a living face is three-dimensional while a face displayed on a picture or a screen is planar, the depth information of the acquired target face region can be used to judge whether the target face region is three-dimensional or planar, thereby performing liveness detection on the target face region.
If the liveness detection succeeds, the target face attribute parameters corresponding to the target face region are obtained, and face matching is performed on the target face region in the infrared-light image according to the target face attribute parameters to obtain a face matching result. The target face attribute parameters are parameters that can characterize the attributes of the target face; according to them, the target face can be identified and matched. The target face attribute parameters include, but are not limited to, face deflection angle, face luminance parameters, facial-feature parameters, skin-texture parameters, geometric feature parameters, and so on. The electronic device 100 may pre-store face attribute parameters for matching. After the target face attribute parameters are obtained, they can be compared with the pre-stored face attribute parameters. If the target face attribute parameters match the pre-stored face attribute parameters, face recognition passes.
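The three-stage pipeline above — face detection on the infrared image, liveness detection on the matching depth region, attribute comparison against stored parameters — can be sketched as follows. This is a toy Python illustration under stated assumptions: the brightness-threshold "detector", the depth-variance flatness test and the attribute comparison are simplified placeholders, not the patent's actual algorithms.

```python
def detect_face_region(ir_image):
    """Placeholder detector: treat the bounding box of sufficiently
    bright pixels as the target face region; return None if no face."""
    bright = [(r, c) for r, row in enumerate(ir_image)
              for c, v in enumerate(row) if v > 128]
    if not bright:
        return None
    rows = [r for r, _ in bright]
    cols = [c for _, c in bright]
    return (min(rows), min(cols), max(rows), max(cols))

def is_live(depth_image, box, flatness=1e-6):
    """A printed photo or a screen is planar, so depth values inside the
    face region have near-zero variance; a real face does not."""
    r0, c0, r1, c1 = box
    vals = [depth_image[r][c] for r in range(r0, r1 + 1)
            for c in range(c0, c1 + 1)]
    mean = sum(vals) / len(vals)
    var = sum((v - mean) ** 2 for v in vals) / len(vals)
    return var > flatness

def recognize(ir_image, depth_image, stored_attrs, extract_attrs):
    """Face detection, then liveness detection, then attribute match."""
    box = detect_face_region(ir_image)
    if box is None or not is_live(depth_image, box):
        return False
    return extract_attrs(ir_image, box) == stored_attrs
```

The key point the sketch preserves is the ordering: attribute matching is only attempted after the depth-based liveness check passes.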
It should be pointed out that the specific process by which the application processor 50 performs face recognition according to the initial depth image and the infrared-light image is not limited to the above; for example, the application processor 50 may also use the initial depth image to assist in detecting the facial contour, so as to improve face-recognition precision, and so on. The process by which the application processor 50 performs face recognition according to the initial depth image and the visible-light image is similar to the process it performs according to the initial depth image and the infrared-light image, and is not described separately here.
Referring to Figures 11 and 12, the application processor 50 is further configured, when identification of the target subject according to the multiple initial depth images and the multiple scene images fails, to synthesize at least two initial depth images obtained by at least two microprocessors 40 into one frame of a merged depth image according to the field angle of the optical receivers 24, synthesize at least two scene images acquired by at least two camera assemblies 30 into one frame of a merged scene image, and identify the target subject according to the merged depth image and the merged scene image. Alternatively, the application processor 50 is further configured, when identification of the target subject according to the multiple initial depth images and the multiple scene images fails, to synthesize at least two initial depth images successively obtained by one microprocessor 40 into one frame of a merged depth image according to the field angle of the optical receivers 24, synthesize at least two scene images acquired by at least two camera assemblies 30 into one frame of a merged scene image, and identify the target subject according to the merged depth image and the merged scene image.
Specifically, in the embodiments shown in Figures 11 and 12, since the field angle of the optical receiver 24 of each time-of-flight component 20 is limited, there may be a situation in which half of a face lies in initial depth image P2 and the other half in initial depth image P3. The application processor 50 then synthesizes initial depth image P2 and initial depth image P3 into one frame of a merged depth image P23, and correspondingly synthesizes infrared-light image I2 and infrared-light image I3 (or visible-light image V2 and visible-light image V3) into one frame of a merged scene image I23 (or V23), so as to identify the target subject again according to the merged depth image P23 and the merged scene image I23 (or V23).
It can be understood that, when the target subject is spread across more initial depth images at once, the application processor 50 may synthesize those initial depth images (corresponding to different orientations) into one frame of a merged depth image, and synthesize the corresponding infrared-light images (corresponding to different orientations) or visible-light images (corresponding to different orientations) into one frame of a merged scene image, so as to re-identify the target subject.
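The fallback just described — try recognition per orientation, and on failure merge the images of adjacent orientations and retry — can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation; the row-wise image layout, the wrap-around pairing of the 360-degree ring and the `recognize` callback are assumptions.

```python
def merge_adjacent(img_a, img_b):
    """Join two same-height images of adjacent orientations row-wise,
    mimicking their adjacent fields of view."""
    return [ra + rb for ra, rb in zip(img_a, img_b)]

def recognize_with_fallback(images, recognize):
    """Try each orientation alone; on failure, retry recognition on
    merged pairs of adjacent orientations (wrapping around the ring)."""
    for img in images:
        if recognize(img):
            return img
    n = len(images)
    for i in range(n):
        merged = merge_adjacent(images[i], images[(i + 1) % n])
        if recognize(merged):
            return merged
    return None
```

In the half-face example above, neither P2 nor P3 alone passes recognition, but their merged frame P23 does.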
Referring to Figures 13 and 14, in one embodiment, the application processor 50 is configured to judge the change in distance between the target subject and the electronic device 100 according to the multiple initial depth images (corresponding to each time-of-flight component 20).
Specifically, each optical transmitter 22 may emit laser light multiple times, and correspondingly each optical receiver 24 may expose multiple times. When there are multiple microprocessors 40, each microprocessor 40 processes the laser pulses emitted multiple times by the corresponding optical transmitter 22 and received multiple times by the corresponding optical receiver 24 to obtain multiple initial depth images. When there is one microprocessor 40, that microprocessor 40 successively processes the laser pulses emitted multiple times by the multiple optical transmitters 22 and received multiple times by the multiple optical receivers 24 to obtain multiple initial depth images.
For example, at a first moment t1 optical transmitter 22a emits a laser pulse and optical receiver 24a receives the laser pulse; at a second moment t2 optical transmitter 22b emits a laser pulse and optical receiver 24b receives the laser pulse; at a third moment t3 optical transmitter 22c emits a laser pulse and optical receiver 24c receives the laser pulse; at a fourth moment t4 optical transmitter 22d emits a laser pulse and optical receiver 24d receives the laser pulse (the first moment t1, the second moment t2, the third moment t3 and the fourth moment t4 lie within the same alternation cycle T). The multiple microprocessors 40 correspondingly obtain initial depth image P11, initial depth image P21, initial depth image P31 and initial depth image P41; alternatively, one microprocessor 40 successively obtains initial depth image P11, initial depth image P21, initial depth image P31 and initial depth image P41. At a fifth moment t5 optical transmitter 22a emits a laser pulse and optical receiver 24a receives the laser pulse; at a sixth moment t6 optical transmitter 22b emits a laser pulse and optical receiver 24b receives the laser pulse; at a seventh moment t7 optical transmitter 22c emits a laser pulse and optical receiver 24c receives the laser pulse; at an eighth moment t8 optical transmitter 22d emits a laser pulse and optical receiver 24d receives the laser pulse (the fifth moment t5, the sixth moment t6, the seventh moment t7 and the eighth moment t8 lie within the same alternation cycle T). The multiple microprocessors 40 correspondingly obtain initial depth image P12, initial depth image P22, initial depth image P32 and initial depth image P42; alternatively, one microprocessor 40 successively obtains initial depth image P12, initial depth image P22, initial depth image P32 and initial depth image P42. Then, the application processor 50 judges the change in distance between the target subject of the first orientation and the electronic device 100 according to initial depth image P11 and initial depth image P12; the change in distance between the target subject of the second orientation and the electronic device 100 according to initial depth image P21 and initial depth image P22; the change in distance between the target subject of the third orientation and the electronic device 100 according to initial depth image P31 and initial depth image P32; and the change in distance between the target subject of the fourth orientation and the electronic device 100 according to initial depth image P41 and initial depth image P42.
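The time-division schedule above — four transmitters firing in turn within each alternation cycle T, each receiver exposing only in its own transmitter's slot — can be written out as a small sketch. Slot and cycle numbering here are illustrative assumptions; the patent specifies only the ordering, not concrete indices.

```python
def schedule(num_components, num_cycles):
    """Return (cycle, slot, component) triples for every emission:
    within each cycle, component k fires in slot k, so adjacent
    receivers never expose during each other's pulse."""
    return [(cycle, slot, slot)
            for cycle in range(num_cycles)
            for slot in range(num_components)]

def moment(cycle, slot, num_components):
    """Global moment index: t1 is (cycle 0, slot 0), t5 is
    (cycle 1, slot 0), and so on."""
    return cycle * num_components + slot + 1
```

With four components and two cycles this reproduces the sequence t1 through t8 described above, with transmitter 22a re-firing at t5.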
It can be understood that, since the initial depth images contain the depth information of the target subject, the application processor 50 can judge, from the depth information at multiple consecutive moments, the change in distance between the target subject of the corresponding orientation and the electronic device 100.
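The per-orientation judgment can be reduced to a comparison of the subject's depth in consecutive initial depth images of the same orientation, as in this minimal sketch (depth values and function names are placeholders, not the patent's method):

```python
def distance_change(depth_t1, depth_t2):
    """Signed change between two moments; negative means the target
    subject moved closer to the device."""
    return depth_t2 - depth_t1

def is_approaching(depths):
    """True if the subject's depth decreases monotonically over the
    sampled consecutive moments."""
    return all(b < a for a, b in zip(depths, depths[1:]))
```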
Referring to Figure 15, the application processor 50 is further configured, when judging the distance change according to the multiple initial depth images fails, to synthesize at least two initial depth images obtained by at least two microprocessors 40 into one frame of a merged depth image according to the field angle of the optical receivers 24; the application processor 50 performs this synthesis step continuously to obtain multiple consecutive frames of merged depth images, and judges the distance change according to the multiple frames of merged depth images. Alternatively, the application processor 50 is further configured, when judging the distance change according to the multiple initial depth images corresponding to each time-of-flight component 20 fails, to synthesize at least two initial depth images, successively obtained by one microprocessor 40 and corresponding to at least two time-of-flight components 20, into one frame of a merged depth image according to the field angle of the optical receivers 24; the application processor 50 performs this synthesis step continuously to obtain multiple consecutive frames of merged depth images, and judges the distance change according to the multiple frames of merged depth images.
Specifically, in the embodiment shown in Figure 15, since the field angle of the optical receiver 24 of each time-of-flight component 20 is limited, there may be a situation in which half of a face lies in initial depth image P21 and the other half in initial depth image P31. The application processor 50 synthesizes the initial depth image P21 of the second moment t2 and the initial depth image P31 of the third moment t3 into one frame of a merged depth image P231, and correspondingly synthesizes the initial depth image P22 of the sixth moment t6 and the initial depth image P32 of the seventh moment t7 into one frame of a merged depth image P232, and then judges the distance change anew according to the two merged frames, merged depth image P231 and merged depth image P232.
It can be understood that, when the target subject is spread across more initial depth images at once, the application processor 50 may synthesize those initial depth images (corresponding to different orientations) into one frame of a merged depth image, and perform this synthesis step continuously for multiple moments.
Referring to Figure 14, when it is judged according to the multiple initial depth images that the distance change is a decrease in distance, or when it is judged according to the multiple frames of merged depth images that the distance change is a decrease in distance, the application processor 50 may increase the frame rate at which initial depth images used to judge the distance change are acquired from the multiple initial depth images transmitted by the microprocessor(s) 40. Specifically, when there are multiple microprocessors 40, the application processor 50 may increase the frame rate at which initial depth images used to judge the distance change are acquired from the multiple initial depth images transmitted by at least one microprocessor 40; when there is one microprocessor 40, the application processor 50 may increase the frame rate at which initial depth images used to judge the distance change are acquired from the multiple initial depth images transmitted by that microprocessor 40.
It can be understood that, when the distance between the target subject and the electronic device 100 decreases, the electronic device 100 cannot judge in advance whether the decrease poses a risk; therefore, the application processor 50 may increase the frame rate at which initial depth images used to judge the distance change are acquired from the multiple initial depth images transmitted by the microprocessor(s) 40, so as to track the distance change more closely. Specifically, when it is judged that the distance for some orientation decreases, the application processor 50 may increase the frame rate at which initial depth images used to judge the distance change are acquired from the multiple initial depth images of that orientation transmitted by the microprocessor 40.
For example, at a first moment t1, a second moment t2, a third moment t3 and a fourth moment t4, the multiple microprocessors 40 respectively obtain, or one microprocessor 40 successively obtains, initial depth image P11, initial depth image P21, initial depth image P31 and initial depth image P41; at a fifth moment t5, a sixth moment t6, a seventh moment t7 and an eighth moment t8, the multiple microprocessors 40 respectively obtain, or one microprocessor 40 successively obtains, initial depth image P12, initial depth image P22, initial depth image P32 and initial depth image P42; at a ninth moment t9, a tenth moment t10, an eleventh moment t11 and a twelfth moment t12, the multiple microprocessors 40 respectively obtain, or one microprocessor 40 successively obtains, initial depth image P13, initial depth image P23, initial depth image P33 and initial depth image P43; and at a thirteenth moment t13, a fourteenth moment t14, a fifteenth moment t15 and a sixteenth moment t16, the multiple microprocessors 40 respectively obtain, or one microprocessor 40 successively obtains, initial depth image P14, initial depth image P24, initial depth image P34 and initial depth image P44. The first moment t1, second moment t2, third moment t3 and fourth moment t4 lie within the same alternation cycle T; the fifth moment t5, sixth moment t6, seventh moment t7 and eighth moment t8 lie within the same alternation cycle T; the ninth moment t9, tenth moment t10, eleventh moment t11 and twelfth moment t12 lie within the same alternation cycle T; and the thirteenth moment t13, fourteenth moment t14, fifteenth moment t15 and sixteenth moment t16 lie within the same alternation cycle T.
Under normal circumstances, the application processor 50 selects initial depth image P11 and initial depth image P14 to judge the change in distance between the target subject of the first orientation and the electronic device 100; selects initial depth image P21 and initial depth image P24 to judge the change in distance between the target subject of the second orientation and the electronic device 100; selects initial depth image P31 and initial depth image P34 to judge the change in distance between the target subject of the third orientation and the electronic device 100; and selects initial depth image P41 and initial depth image P44 to judge the change in distance between the target subject of the fourth orientation and the electronic device 100. The frame rate at which the application processor 50 acquires initial depth images for each orientation is thus one frame at an interval of two frames, that is, one frame out of every three is selected.
When it is judged according to initial depth image P11 and initial depth image P14 that the distance for the first orientation decreases, the application processor 50 may instead select initial depth image P11 and initial depth image P13 to judge the change in distance between the target subject of the first orientation and the electronic device 100. The frame rate at which the application processor 50 acquires initial depth images for the first orientation becomes one frame at an interval of one frame, that is, one frame out of every two is selected, while the frame rates of the other orientations remain unchanged: the application processor 50 still selects initial depth image P21 and initial depth image P24 to judge the distance change, selects initial depth image P31 and initial depth image P34 to judge the distance change, and selects initial depth image P41 and initial depth image P44 to judge the distance change.
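The adaptive sampling just described amounts to changing a per-orientation stride over the stream of initial depth images: normally one frame in every three is used (P11 against P14), and once an orientation's distance is seen to shrink, its stride tightens to one frame in every two (P11 against P13). The following Python sketch is illustrative only; the stride values and frame indexing are assumptions drawn from the example above.

```python
def sample_indices(num_frames, stride):
    """Indices of the frames used to judge distance change at a given
    sampling stride (0-based: frame 0 is P11, frame 3 is P14)."""
    return list(range(0, num_frames, stride))

def adjust_stride(current_stride, distance_decreasing):
    """Tighten the sampling interval by one frame when the distance for
    this orientation is decreasing; otherwise leave it unchanged."""
    return max(1, current_stride - 1) if distance_decreasing else current_stride
```

At stride 3 the processor compares frames 0 and 3 (P11 and P14); after a detected decrease the stride drops to 2 and it compares frames 0 and 2 (P11 and P13), while other orientations keep their original stride.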
When it is judged according to initial depth image P11 and initial depth image P14 that the distance for the first orientation decreases, and at the same time it is judged according to initial depth image P21 and initial depth image P24 that the distance for the second orientation decreases, the application processor 50 may instead select initial depth image P11 and initial depth image P13 to judge the change in distance between the target subject of the first orientation and the electronic device 100, and select initial depth image P21 and initial depth image P23 to judge the change in distance between the target subject of the second orientation and the electronic device 100. The frame rate at which the application processor 50 acquires initial depth images for the first and second orientations becomes one frame at an interval of one frame, that is, one frame out of every two is selected, while the frame rates of the other orientations remain unchanged: the application processor 50 still selects initial depth image P31 and initial depth image P34 to judge the change in distance between the target subject of the third orientation and the electronic device 100, and selects initial depth image P41 and initial depth image P44 to judge the change in distance between the target subject of the fourth orientation and the electronic device 100.
Of course, when it is judged that the distance for any one orientation decreases, the application processor 50 may also increase the frame rate at which initial depth images used to judge the distance change are acquired from the multiple initial depth images of every orientation transmitted by the microprocessor(s) 40. That is, when it is judged according to initial depth image P11 and initial depth image P14 that the distance between the target subject of the first orientation and the electronic device 100 decreases, the application processor 50 may select initial depth image P11 and initial depth image P13 to judge the change in distance between the target subject of the first orientation and the electronic device 100, select initial depth image P21 and initial depth image P23 to judge the change in distance between the target subject of the second orientation and the electronic device 100, select initial depth image P31 and initial depth image P33 to judge the change in distance between the target subject of the third orientation and the electronic device 100, and select initial depth image P41 and initial depth image P43 to judge the change in distance between the target subject of the fourth orientation and the electronic device 100.
When the distance decreases, the application processor 50 may also judge the distance change in combination with the visible-light images or infrared-light images. Specifically, the application processor 50 first identifies the target subject according to the visible-light images or infrared-light images, and then judges the distance change according to the initial depth images of multiple moments, so as to control the electronic device 100 to perform different operations for different target subjects and different distances. Alternatively, when the distance decreases, the microprocessor 40 controls the corresponding optical transmitter 22 to increase the frequency of laser emission and the corresponding optical receiver 24 to increase the frequency of exposure, and so on.
It should be noted that the electronic device 100 of this embodiment may also be used as an external terminal, fixedly or removably mounted on the outside of a portable electronic device such as a mobile phone, tablet computer or laptop computer, and may also be fixedly mounted for use on a movable object such as a vehicle body (as shown in Figures 12 and 13), an unmanned-aerial-vehicle body, a robot body or a ship body. In specific use, when the electronic device 100 synthesizes one frame of a panoramic depth image from multiple initial depth images as described above, the panoramic depth image can be used for three-dimensional modeling, simultaneous localization and mapping (SLAM), augmented-reality display, and so on. When the electronic device 100 identifies the target subject as described above, it can be applied to face-recognition unlocking and payment on portable electronic devices, or to obstacle avoidance for robots, vehicles, unmanned aerial vehicles, ships, and the like. When the electronic device 100 judges the change in distance between the target subject and the electronic device 100 as described above, it can be applied to the automatic operation of robots, vehicles, unmanned aerial vehicles, ships, and the like, as well as to object tracking, and so on.
Referring to Figures 2 and 16, an embodiment of the application also provides a mobile platform 300. The mobile platform 300 includes a body 10 and multiple time-of-flight components 20 arranged on the body 10. The multiple time-of-flight components 20 are located at multiple different orientations of the body 10. Each time-of-flight component 20 includes an optical transmitter 22 and an optical receiver 24. The optical transmitter 22 is configured to emit laser pulses outward from the body 10, and the optical receiver 24 is configured to receive the laser pulses, emitted by the corresponding optical transmitter 22, that are reflected by the target subject. The optical transmitters 22 in the time-of-flight components 20 of adjacent orientations emit laser pulses in a time-division manner, and the optical receivers 24 in the time-of-flight components 20 of adjacent orientations expose at staggered times, so as to obtain a panoramic depth image.
Specifically, the body 10 may be a vehicle body, an unmanned-aerial-vehicle body, a robot body or a ship body.
Referring to Figure 16, when the body 10 is a vehicle body, the number of time-of-flight components 20 is four, and the four time-of-flight components 20 are respectively mounted on the four sides of the vehicle body, for example the front, the rear, the left side and the right side. The vehicle body can carry the multiple time-of-flight components 20 as it moves along a road, constructing 360-degree panoramic depth images along the travel route to serve as a reference map and the like, or obtaining initial depth images of multiple different orientations so as to identify the target subject and judge the change in distance between the target subject and the mobile platform 300, thereby controlling the vehicle body to accelerate, decelerate, stop, detour and so on, and realizing driverless obstacle avoidance. For example, while the vehicle is moving along a road, if it is recognized that the distance between a target subject and the vehicle is decreasing and the target subject is a pit in the road, the vehicle decelerates with a first acceleration; if it is recognized that the distance between a target subject and the vehicle is decreasing and the target subject is a person, the vehicle decelerates with a second acceleration, where the absolute value of the first acceleration is less than the absolute value of the second acceleration. In this way, different operations are performed for different target subjects when the distance decreases, which makes the vehicle more intelligent.
Referring to Figure 17, when the body 10 is an unmanned-aerial-vehicle body, the number of time-of-flight components 20 is four, and the four time-of-flight components 20 are respectively mounted on the front, rear, left and right sides of the unmanned-aerial-vehicle body, or on the front, rear, left and right sides of a gimbal carried by the unmanned-aerial-vehicle body. The unmanned-aerial-vehicle body can carry the multiple time-of-flight components 20 in flight for aerial photography, inspection and the like; the unmanned aerial vehicle can return the acquired panoramic depth images to a ground control terminal, or directly perform SLAM itself. The multiple time-of-flight components 20 enable the unmanned aerial vehicle to accelerate, decelerate, stop, avoid obstacles and track objects.
Referring to Figure 18, when the body 10 is a robot body, for example that of a sweeping robot, the number of time-of-flight components 20 is four, and the four time-of-flight components 20 are respectively mounted on the front, rear, left and right sides of the robot body. The robot body can carry the multiple time-of-flight components 20 as it moves around the home, obtaining initial depth images of multiple different orientations so as to identify the target subject and judge the change in distance between the target subject and the mobile platform 300, thereby controlling the movement of the robot body and enabling the robot to remove garbage, avoid obstacles, and so on.
Referring to Figure 19, when the body 10 is a ship body, the number of time-of-flight components 20 is four, and the four time-of-flight components 20 are respectively mounted on the front, rear, left and right sides of the ship body. The ship body can carry the time-of-flight components 20 as it moves, obtaining initial depth images of multiple different orientations so as to accurately identify the target subject even in adverse environments (for example, in fog), judge the change in distance between the target subject and the mobile platform 300, and improve the safety of sea travel, and so on.
The mobile platform 300 of the embodiments of the application is a platform that can move independently, with the multiple time-of-flight components 20 mounted on its body 10 to obtain panoramic depth images. In contrast, the body of the electronic device 100 of the embodiments of the application generally cannot move independently; the electronic device 100 can further be mounted on a device that can move, similar to the mobile platform 300, thereby helping that device obtain panoramic depth images.
It should be pointed out that the above descriptions of the body 10, the time-of-flight components 20, the camera assemblies 30, the microprocessors 40 and the application processor 50 of the electronic device 100 are equally applicable to the mobile platform 300 of the embodiments of the application, and are not repeated here.
Although the embodiments of the application have been shown and described above, it is to be understood that the above embodiments are exemplary and should not be construed as limiting the application. Those skilled in the art may make changes, modifications, replacements and variations to the above embodiments within the scope of the application, and the scope of the application is defined by the claims and their equivalents.
Claims (19)
1. An electronic device, characterized in that the electronic device comprises:
a body; and
multiple time-of-flight components arranged on the body, the multiple time-of-flight components being located at multiple different orientations of the body, each time-of-flight component comprising an optical transmitter and an optical receiver, the optical transmitter being configured to emit laser pulses outward from the body, the optical receiver being configured to receive the laser pulses, emitted by the corresponding optical transmitter, that are reflected by a target subject;
wherein the optical transmitters in the time-of-flight components of adjacent orientations emit the laser pulses in a time-division manner, and the optical receivers in the time-of-flight components of adjacent orientations expose at staggered times, so as to obtain a panoramic depth image.
2. The electronic device according to claim 1, characterized in that there are four time-of-flight components, and the field angle of each optical transmitter and of each optical receiver is any value from 80 degrees to 100 degrees.
3. The electronic device according to claim 1, characterized in that the optical transmitters in the multiple time-of-flight components emit the laser pulses in a time-division manner, and the optical receivers in the multiple time-of-flight components expose at staggered times; when the optical receiver in any one time-of-flight component is exposing, the optical transmitters in the other time-of-flight components are turned off.
4. The electronic device according to claim 3, characterized in that the multiple optical transmitters in the multiple time-of-flight components are switched on in turn and emit the laser pulses without interruption, and the exposure time of the optical receiver in each time-of-flight component lies within the emission phase of the corresponding optical transmitter.
5. The electronic device according to claim 3, characterized in that the plurality of optical transmitters in the plurality of time-of-flight components are turned on in sequence and emit the laser pulses with a predetermined interval between emissions, and the plurality of optical receivers in the plurality of time-of-flight components are turned on in sequence and expose with the predetermined interval between exposures.
6. The electronic device according to claim 3, characterized in that the plurality of optical transmitters in the plurality of time-of-flight components are turned on in sequence and emit the laser pulses with a predetermined interval between emissions, and the plurality of optical receivers in the plurality of time-of-flight components are turned on in sequence and expose without interruption.
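Claims 4 through 6 differ only in whether consecutive emission phases abut or are separated by a predetermined interval. A small sketch, with illustrative timing values not taken from the patent, shows how one `gap_us` parameter covers both cases:

```python
# Hypothetical sketch contrasting the two emission schedules above:
# back-to-back emission (claim 4) versus emission separated by a
# predetermined idle interval (claims 5 and 6). gap_us = 0 reproduces
# the uninterrupted case.

def emission_slots(num_components, slot_us, gap_us=0):
    """Start/end times of each transmitter's emission phase."""
    slots = []
    t = 0
    for i in range(num_components):
        slots.append((i, t, t + slot_us))
        t += slot_us + gap_us
    return slots

back_to_back = emission_slots(4, 100, gap_us=0)
spaced = emission_slots(4, 100, gap_us=20)
assert back_to_back[1][1] == 100  # next phase starts immediately
assert spaced[1][1] == 120        # next phase starts after a 20 us gap
```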
7. The electronic device according to claim 1, characterized in that the electronic device further comprises an application processor and a plurality of microprocessors, each microprocessor corresponding to one time-of-flight component, and the plurality of microprocessors each being connected to the application processor; each microprocessor is configured to obtain an initial depth image from the laser pulses emitted by the optical transmitter of the corresponding time-of-flight component and the laser pulses received by the optical receiver, and to transmit the initial depth image to the application processor; the application processor is configured to synthesize, according to the field angle of the optical receivers, the plurality of initial depth images obtained by the plurality of microprocessors into one frame of the panoramic depth image.
8. The electronic device according to claim 1, characterized in that the electronic device further comprises an application processor and one microprocessor, the microprocessor being connected to the application processor; the microprocessor is configured to successively obtain a plurality of initial depth images from the laser pulses emitted by the optical transmitters of the plurality of time-of-flight components and the laser pulses received by the optical receivers, and to transmit the initial depth images to the application processor; the application processor is configured to synthesize, according to the field angle of the optical receivers, the plurality of initial depth images obtained by the microprocessor into one frame of the panoramic depth image.
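The synthesis step in claims 7 and 8 stitches the per-component depth images according to the receivers' field angle. A minimal sketch, assuming four receivers and simple column-cropping of any angular overlap (the patent does not specify the stitching method, and `synthesize_panorama` is an invented name):

```python
# Hypothetical sketch: if each optical receiver covers a 90-degree
# field angle, four initial depth images tile a full 360 degrees. With
# field angles above 90 degrees the views overlap, and the overlapping
# columns are cropped symmetrically before concatenation.

import numpy as np

def synthesize_panorama(depth_images, fov_deg=90.0):
    """Concatenate per-component depth maps, cropping any overlap."""
    n = len(depth_images)
    overlap_deg = max(0.0, (n * fov_deg - 360.0) / n)
    keep = []
    for img in depth_images:
        cols = img.shape[1]
        crop = int(cols * (overlap_deg / fov_deg) / 2)
        keep.append(img[:, crop:cols - crop] if crop else img)
    return np.hstack(keep)

views = [np.full((4, 100), i, dtype=float) for i in range(4)]
pano = synthesize_panorama(views, fov_deg=100.0)
# 100-degree views overlap by 10 degrees each, so 5 columns are
# cropped from each side of every 100-column view.
assert pano.shape == (4, 360)
```

Real stitching would blend the overlap rather than discard it, but the crop-and-concatenate form shows why the field angle is the parameter the application processor needs.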
9. The electronic device according to claim 1, characterized in that the electronic device further comprises an application processor and a plurality of microprocessors, each microprocessor corresponding to one time-of-flight component, and the plurality of microprocessors each being connected to the application processor; each microprocessor is configured to obtain an initial depth image from the laser pulses emitted by the optical transmitter of the corresponding time-of-flight component and the laser pulses received by the optical receiver, and to transmit the initial depth image to the application processor;
the electronic device further comprises a plurality of camera assemblies disposed on the body, each camera assembly corresponding to one time-of-flight component, and the plurality of camera assemblies each being connected to the application processor; each camera assembly is configured to capture a scene image of the target subject and output the scene image to the application processor;
the application processor is configured to identify the target subject according to the plurality of initial depth images obtained by the plurality of microprocessors and the plurality of scene images captured by the plurality of camera assemblies.
10. The electronic device according to claim 9, characterized in that the application processor is further configured to, when identification of the target subject from the plurality of initial depth images and the plurality of scene images fails, synthesize, according to the field angle of the optical receivers, at least two initial depth images obtained by at least two microprocessors into one frame of a merged depth image, synthesize at least two scene images captured by at least two camera assemblies into one frame of a merged scene image, and identify the target subject according to the merged depth image and the merged scene image.
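The fallback in claim 10 handles a subject that straddles two adjacent views: per-view recognition fails, so adjacent depth and scene images are merged and recognition is retried. A toy sketch under assumed names (`identify`, `merge_adjacent`, and the `recognize` callback are all invented stand-ins, not the patent's recognition method):

```python
# Hypothetical sketch of the claim-10 fallback: try recognition on each
# view; on failure, merge adjacent depth and scene image pairs and
# retry on the wider merged views.

import numpy as np

def merge_adjacent(img_a, img_b):
    """Side-by-side merge of two adjacent views."""
    return np.hstack([img_a, img_b])

def identify(depth, scene, recognize):
    """Try per-view recognition; fall back to merged adjacent views."""
    for d, s in zip(depth, scene):
        if recognize(d, s):
            return True
    for i in range(len(depth) - 1):
        md = merge_adjacent(depth[i], depth[i + 1])
        ms = merge_adjacent(scene[i], scene[i + 1])
        if recognize(md, ms):
            return True
    return False

# Toy recognizer that only succeeds on views at least 8 columns wide,
# simulating a subject split across two adjacent views.
recognize = lambda d, s: d.shape[1] >= 8
views_d = [np.zeros((2, 5)) for _ in range(4)]
views_s = [np.zeros((2, 5)) for _ in range(4)]
assert identify(views_d, views_s, recognize)
```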
11. The electronic device according to claim 1, characterized in that the electronic device further comprises an application processor and one microprocessor, the microprocessor being connected to the application processor; the microprocessor is configured to successively obtain a plurality of initial depth images from the laser pulses emitted by the optical transmitters of the plurality of time-of-flight components and the laser pulses received by the optical receivers, and to transmit the initial depth images to the application processor;
the electronic device further comprises a plurality of camera assemblies disposed on the body, each camera assembly corresponding to one time-of-flight component, and the plurality of camera assemblies each being connected to the application processor; each camera assembly is configured to capture a scene image of the target subject and output the scene image to the application processor;
the application processor is configured to identify the target subject according to the plurality of initial depth images obtained by the microprocessor and the plurality of scene images captured by the plurality of camera assemblies.
12. The electronic device according to claim 11, characterized in that the application processor is further configured to, when identification of the target subject from the plurality of initial depth images and the plurality of scene images fails, synthesize, according to the field angle of the optical receivers, at least two initial depth images obtained by the microprocessor into one frame of a merged depth image, synthesize at least two scene images captured by at least two camera assemblies into one frame of a merged scene image, and identify the target subject according to the merged depth image and the merged scene image.
13. The electronic device according to claim 1, characterized in that the electronic device further comprises an application processor and a plurality of microprocessors, each microprocessor corresponding to one time-of-flight component, and the plurality of microprocessors each being connected to the application processor; each microprocessor is configured to obtain a plurality of initial depth images from the laser pulses repeatedly emitted by the optical transmitter of the corresponding time-of-flight component and the laser pulses repeatedly received by the optical receiver, and to transmit the initial depth images to the application processor; the application processor is configured to judge the change in the distance between the target subject and the electronic device according to the plurality of initial depth images.
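Judging the distance change in claim 13 amounts to comparing successive depth frames. A minimal sketch under assumed names (`distance_trend` and the median-comparison rule are illustrative, not the patent's judgment criterion):

```python
# Hypothetical sketch: compare the median depth of successive initial
# depth frames; a falling median suggests the target subject is moving
# closer to the device, a rising one that it is moving away.

import statistics

def distance_trend(depth_frames):
    """Return 'closer', 'farther', or 'steady' from per-frame medians."""
    medians = [statistics.median(f) for f in depth_frames]
    if medians[-1] < medians[0]:
        return "closer"
    if medians[-1] > medians[0]:
        return "farther"
    return "steady"

# Three frames of toy depth samples (metres), steadily decreasing.
frames = [[2.0, 2.1, 1.9], [1.5, 1.6, 1.4], [1.0, 1.1, 0.9]]
assert distance_trend(frames) == "closer"
```

Using a median rather than a mean keeps a few outlier depth samples (e.g. dropouts at object edges) from flipping the judgment.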
14. The electronic device according to claim 13, characterized in that the application processor is further configured to, when judging the distance change from the plurality of initial depth images fails, synthesize, according to the field angle of the optical receivers, at least two initial depth images obtained by at least two microprocessors into one frame of a merged depth image; the application processor continues to perform the synthesis step to obtain a plurality of consecutive frames of the merged depth image, and judges the distance change according to the plurality of frames of the merged depth image.
15. The electronic device according to claim 1, characterized in that the electronic device further comprises an application processor and one microprocessor, the microprocessor being connected to the application processor; the microprocessor is configured to successively obtain a plurality of initial depth images from the laser pulses repeatedly emitted by the optical transmitters of the plurality of time-of-flight components and the laser pulses repeatedly received by the optical receivers, and to transmit the initial depth images to the application processor; the application processor is configured to judge the change in the distance between the target subject and the electronic device according to the plurality of initial depth images corresponding to each time-of-flight component.
16. The electronic device according to claim 15, characterized in that the application processor is further configured to, when judging the distance change from the plurality of initial depth images corresponding to each time-of-flight component fails, synthesize, according to the field angle of the optical receivers, at least two initial depth images corresponding to at least two time-of-flight components and obtained by the microprocessor into one frame of a merged depth image; the application processor continues to perform the synthesis step to obtain a plurality of consecutive frames of the merged depth image, and judges the distance change according to the plurality of frames of the merged depth image.
17. The electronic device according to any one of claims 13 to 16, characterized in that the application processor is further configured to, when the judged distance change is a decrease in distance, increase the frame rate at which the initial depth images used to judge the distance change are acquired from among the plurality of initial depth images transmitted by the microprocessor.
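The policy in claim 17 is a simple conditional: an approaching subject raises collision risk, so depth frames are sampled faster. A sketch with illustrative rates (the specific frame-rate values and the function name are assumptions, not from the patent):

```python
# Hypothetical sketch of the claim-17 frame-rate policy: when the
# judged distance change is a decrease (subject approaching), raise the
# rate at which initial depth images are taken for the distance
# judgment. The rates below are illustrative values only.

NORMAL_FPS = 10   # assumed baseline acquisition rate
BOOSTED_FPS = 30  # assumed boosted rate when the subject approaches

def acquisition_fps(distance_change):
    """Pick the depth-image acquisition frame rate."""
    return BOOSTED_FPS if distance_change == "closer" else NORMAL_FPS

assert acquisition_fps("closer") == 30
assert acquisition_fps("farther") == 10
```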
18. A mobile platform, characterized in that the mobile platform comprises:
a body; and
a plurality of time-of-flight components disposed on the body, the plurality of time-of-flight components being located at a plurality of different orientations of the body, each time-of-flight component comprising an optical transmitter and an optical receiver, the optical transmitter being configured to emit laser pulses outward from the body, and the optical receiver being configured to receive the laser pulses emitted by the corresponding optical transmitter and reflected by a target subject;
wherein the optical transmitters in the time-of-flight components at adjacent orientations emit the laser pulses at staggered times, and the optical receivers in the time-of-flight components at adjacent orientations expose at staggered times, so as to obtain a panoramic depth image.
19. The mobile platform according to claim 18, characterized in that the body is a vehicle body, an unmanned aerial vehicle body, a robot body, or a ship body.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910007545.8A CN109788195B (en) | 2019-01-04 | 2019-01-04 | Electronic equipment and mobile platform |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109788195A true CN109788195A (en) | 2019-05-21 |
CN109788195B CN109788195B (en) | 2021-04-16 |
Family
ID=66500037
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910007545.8A Active CN109788195B (en) | 2019-01-04 | 2019-01-04 | Electronic equipment and mobile platform |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109788195B (en) |
Citations (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101494736A (en) * | 2009-02-10 | 2009-07-29 | 杨立群 | Filming system |
CN101984463A (en) * | 2010-11-02 | 2011-03-09 | 中兴通讯股份有限公司 | Method and device for synthesizing panoramic image |
CN102129550A (en) * | 2011-02-17 | 2011-07-20 | 华南理工大学 | Scene perception method |
CN104055489A (en) * | 2014-07-01 | 2014-09-24 | 李栋 | Blood vessel imaging device |
US20170019594A1 (en) * | 2015-07-13 | 2017-01-19 | Futurewei Technologies, Inc. | Increasing spatial resolution of panoramic video captured by a camera array |
CN106371281A (en) * | 2016-11-02 | 2017-02-01 | 辽宁中蓝电子科技有限公司 | Multi-module 360-degree space scanning and positioning 3D camera based on structured light |
CN106461783A (en) * | 2014-06-20 | 2017-02-22 | 高通股份有限公司 | Automatic multiple depth cameras synchronization using time sharing |
US9653874B1 (en) * | 2011-04-14 | 2017-05-16 | William J. Asprey | Trichel pulse energy devices |
CN107263480A (en) * | 2017-07-21 | 2017-10-20 | 深圳市萨斯智能科技有限公司 | A kind of robot manipulation's method and robot |
CN107742296A (en) * | 2017-09-11 | 2018-02-27 | 广东欧珀移动通信有限公司 | Dynamic image generation method and electronic installation |
US20180139431A1 (en) * | 2012-02-24 | 2018-05-17 | Matterport, Inc. | Capturing and aligning panoramic image and depth data |
CN108471487A (en) * | 2017-02-23 | 2018-08-31 | 钰立微电子股份有限公司 | Generate the image device and associated picture device of panoramic range image |
CN108541304A (en) * | 2015-04-29 | 2018-09-14 | 苹果公司 | Flight time depth map with flexible scan pattern |
CN108616703A (en) * | 2018-04-23 | 2018-10-02 | Oppo广东移动通信有限公司 | Electronic device and its control method, computer equipment and readable storage medium storing program for executing |
CN108810500A (en) * | 2017-12-22 | 2018-11-13 | 成都理想境界科技有限公司 | The method of adjustment of spliced scanning imagery equipment and spliced scanning imagery equipment |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111246073A (en) * | 2020-03-23 | 2020-06-05 | 维沃移动通信有限公司 | Imaging device, method and electronic equipment |
CN111246073B (en) * | 2020-03-23 | 2022-03-25 | 维沃移动通信有限公司 | Imaging device, method and electronic equipment |
Also Published As
Publication number | Publication date |
---|---|
CN109788195B (en) | 2021-04-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109862275A (en) | Electronic equipment and mobile platform | |
EP3603055B1 (en) | Depth sensing techniques for virtual, augmented, and mixed reality systems | |
US10503265B2 (en) | Mixed-mode depth detection | |
EP1504597B1 (en) | Method for displaying an output image on an object | |
CN109618108A (en) | Electronic equipment and mobile platform | |
US7453511B2 (en) | Apparatus and method for inputting reflected light image of a target object | |
CN112189147B (en) | Time-of-flight (TOF) camera and TOF method | |
CN1679345A (en) | Pointed position detection device and pointed position detection method | |
CN107429998B (en) | Range image acquisition device and range image acquisition methods | |
US11240481B2 (en) | Creation and user interactions with three-dimensional wallpaper on computing devices | |
CN109587303A (en) | Electronic equipment and mobile platform | |
CN109688400A (en) | Electronic equipment and mobile platform | |
US10834323B2 (en) | Electronic apparatus, motion sensor, position change detection program, and position change detection method | |
CN117173756A (en) | Augmented reality AR system, computer equipment and storage medium | |
CN110933290A (en) | Virtual photographing integrated system and method based on human-computer interaction | |
CN109618085A (en) | Electronic equipment and mobile platform | |
CN109660731A (en) | Electronic equipment and mobile platform | |
US20240244330A1 (en) | Systems and methods for capturing and generating panoramic three-dimensional models and images | |
WO2022161386A1 (en) | Pose determination method and related device | |
CN109587304A (en) | Electronic equipment and mobile platform | |
CN108287345A (en) | Spacescan method and system based on point cloud data | |
CN109803089A (en) | Electronic equipment and mobile platform | |
CN109788195A (en) | Electronic equipment and mobile platform | |
CN107462248A (en) | A kind of indoor optical positioning system and its application method | |
CN109660733A (en) | Electronic equipment and mobile platform |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||