CN109819173A - Depth fusion method based on TOF imaging system, and TOF camera - Google Patents


Info

Publication number
CN109819173A
CN109819173A (application CN201711170732.5A)
Authority
CN
China
Prior art keywords
data
depth
tof
target object
imaging system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201711170732.5A
Other languages
Chinese (zh)
Other versions
CN109819173B (en)
Inventor
陈立刚
赵俊能
周劲蕾
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Sunny Optical Intelligent Technology Co Ltd
Original Assignee
Zhejiang Sunny Optical Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Sunny Optical Intelligent Technology Co Ltd filed Critical Zhejiang Sunny Optical Intelligent Technology Co Ltd
Priority to CN201711170732.5A priority Critical patent/CN109819173B/en
Publication of CN109819173A publication Critical patent/CN109819173A/en
Application granted granted Critical
Publication of CN109819173B publication Critical patent/CN109819173B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Studio Devices (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a depth fusion method based on a TOF imaging system, and a TOF camera. The TOF imaging system is used to obtain the depth information of at least one target object, and the method comprises the following steps: obtaining at least one long exposure data and at least one short exposure data of the target object; obtaining at least one depth data of the target object, wherein the depth data comprises at least one short depth data and at least one long depth data; and fusing the short depth data and the long depth data, so as to obtain at least one fused data of the target object. The depth fusion method fuses the images of the target object based on the depth information of the target object, so as to obtain the complete depth information of the target object.

Description

Depth fusion method based on TOF imaging system, and TOF camera
Technical field
The present invention relates to the field of photography, and in particular to a depth fusion method based on a TOF imaging system, and to a TOF camera. The invention focuses on realizing the depth fusion of images by the TOF imaging system on an embedded platform.
Background technique
With the continuous development of science and technology, machine vision has been developing rapidly as an important branch of artificial intelligence. Simply put, machine vision equips a machine with the functions of the human eye, or even with visual capabilities more powerful than the human eye, so that the machine can intelligently measure and judge a target object in place of a human observer. Specifically, a machine vision system converts a target object into an image signal through a machine vision product (such as an image acquisition device) and transfers the signal to a dedicated image processing system, which obtains the shape information of the target object. The image processing system performs various operations on signals such as pixel distribution, brightness, and color to extract the features of the target object, and then controls on-site device actions according to the results of this discrimination.
Machine vision is widely used in many fields. For example, it can be applied to automatic tracking and obstacle avoidance for aerial-photography drones, as well as in VR (virtual reality), AR (augmented reality), ADAS (advanced driver assistance systems), SLAM (simultaneous localization and mapping), and other fields, replacing human vision in obtaining various kinds of information about a target object. An essential and most important function in machine vision applications is depth perception: accurately obtaining the depth data of a target object prepares the ground for subsequent measurement or application, and TOF is one of the important technologies for realizing depth perception.
The time-of-flight method (Time of Flight, TOF) measures the three-dimensional structure or three-dimensional profile of a target object (or of a detection region of the target object) by measuring either the time interval t between the emission and reception of a pulse signal sent by the measuring instrument (commonly known as pulse ranging) or the phase shift produced by one round trip of a laser to the target object (phase-difference telemetry). An instrument that measures with TOF technology is defined as a TOF measuring instrument, such as a TOF camera; the measurement of the depth or three-dimensional structure of a target object by the TOF camera is mainly based on measuring the phase difference of the pulse signal or laser. Such an instrument generally includes a light source emitting module and a photosensitive receiving module, which are matched with each other and generate the depth information of the measured target based on TOF depth measurement. More specifically, the light source emitting module emits a light wave of a specific band; the emitted light wave is reflected on the surface of the measured target and received by the photosensitive receiving module; the photosensitive receiving module then calculates the depth information of the measured target according to the time difference or phase difference between the emitted and received light waves. A TOF measuring instrument can obtain not only the depth information of the measured target but also its grayscale and luminance information. It is precisely because of these unique characteristics that TOF test equipment is widely applied in machine vision.
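As a rough illustration of the two ranging principles above, the following Python fragment (an illustrative sketch, not from the patent) converts a measured round-trip time or a measured phase shift of a modulated wave into a distance:

```python
import math

C = 299_792_458.0  # speed of light in m/s


def pulse_depth(round_trip_time_s: float) -> float:
    """Pulse ranging: distance from the emit-to-receive time interval t."""
    return C * round_trip_time_s / 2.0


def phase_depth(phase_rad: float, mod_freq_hz: float) -> float:
    """Phase-difference telemetry: distance from the phase shift of a wave
    modulated at mod_freq_hz. Unambiguous only up to C / (2 * mod_freq_hz)."""
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)
```

For instance, with a 20 MHz modulation frequency, a full 2π phase shift corresponds to the unambiguous range of about 7.5 m, which is why later embodiments distinguish single-frequency and dual-frequency exposures for near and far scenes.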
However, when existing TOF test equipment obtains the depth information of a target object, problems arise to a greater or lesser extent, so that the equipment cannot obtain the complete depth information of the target object, which affects the depth data in machine vision or other subsequent applications. Specifically, when shooting target objects at different distances, existing TOF test equipment often suffers from under-exposure or over-exposure, leading to the loss of depth information of the target object. For example, when existing TOF test equipment shoots a relatively distant target object, the suitable exposure time for that object is jointly affected by many factors such as the ambient brightness, the brightness of the target object, the camera aperture, and the distance of the target object; that is, the suitable exposure time of the target object changes from moment to moment and place to place, and existing TOF test equipment cannot always obtain the suitable exposure time, so it suffers from over-exposure or under-exposure when shooting the target object. Furthermore, even if the TOF test equipment achieves a good exposure time when shooting a distant target object, it will then suffer from over-exposure or under-exposure when shooting a nearby target object; that is, the TOF test equipment cannot obtain the complete depth information of near and far objects at the same time.
Once the depth information of the target obtained by the TOF test equipment contains errors, the subsequent applications of that depth information will be affected to a greater or lesser extent. For example, during the obstacle avoidance of an unmanned aerial vehicle, suppose the depth information of a target object obtained by the TOF test equipment mounted on the vehicle is biased; the vehicle then obtains erroneous information about its surroundings, which may cause it to collide with surrounding obstacles and lead to unnecessary accidents or losses.
In conclusion existing TOF test equipment when obtaining far and near target object, often exist it is over-exposed or Under-exposed problem, i.e., the described TOF test equipment can not obtain the complete depth information of the target object.
Summary of the invention
An object of the present invention is to provide a depth fusion method based on a TOF imaging system, wherein the depth fusion method fuses the images of a target object based on the depth information of the target object, so as to obtain the complete depth information of the target object.
An object of the present invention is to provide a depth fusion method based on a TOF imaging system, wherein the depth fusion method fuses the images of the target object based on the depth information of the target object, so as to be suitable for obtaining a three-dimensional image of the target object; in other words, the depth fusion method is applied to the TOF imaging system to obtain the complete three-dimensional image information of the target object.
An object of the present invention is to provide a depth fusion method based on a TOF imaging system, wherein the depth fusion method can solve the exposure problems that exist when the TOF imaging system shoots near or far target objects, so as to obtain the complete depth information of the target object.
An object of the present invention is to provide a depth fusion method based on a TOF imaging system, wherein the depth fusion method can obtain the complete depth information of a relatively distant target object; in other words, the depth fusion method is applied to the TOF imaging system to obtain the complete depth information of a relatively distant target object.
An object of the present invention is to provide a depth fusion method based on a TOF imaging system, wherein the depth fusion method can obtain the complete depth information of a relatively near target object; in other words, the depth fusion method is applied to the TOF imaging system to obtain the complete depth information of a relatively near target object.
An object of the present invention is to provide a depth fusion method based on a TOF imaging system, wherein the depth fusion method can obtain the depth information of near and far target objects at the same time, so as to improve the imaging efficiency of the TOF imaging system.
An object of the present invention is to provide a depth fusion method based on a TOF imaging system, wherein the depth fusion method is applied to the TOF imaging system to obtain the complete depth information of the target object rapidly; that is, the fusion speed of the depth fusion method is fast.
An object of the present invention is to provide a depth fusion method based on a TOF imaging system, wherein the depth fusion method can quickly obtain the complete depth information of the target object by acquiring images only once, so as to facilitate the imaging operation of the TOF imaging system.
An object of the present invention is to provide a depth fusion method based on a TOF imaging system, wherein the TOF imaging system can be used in combination with host computers of many types; that is, the depth fusion data obtained by the TOF imaging system can be transmitted to host computers of many types.
An object of the present invention is to provide a depth fusion method based on a TOF imaging system, wherein the depth fusion data are transmitted to a machine vision system and applied there, so as to improve the visual detection efficiency and accuracy of the machine vision system.
An object of the present invention is to provide a depth fusion method based on a TOF imaging system, wherein the depth fusion method is used in combination with an automatic exposure algorithm, so as to avoid the loss of depth information in the transition region between long and short exposures and thereby obtain the complete depth information of the target object.
In order to achieve any of the above objects of the invention, the present invention provides a depth fusion method based on a TOF imaging system, wherein the TOF imaging system is used to obtain the depth information of at least one target object, the method comprising the following steps:
S1: an exposure data module obtains at least one long exposure data and at least one short exposure data of the target object;
S2: a computing module obtains at least one depth data of the target object, wherein the depth data comprises at least one short depth data and at least one long depth data; and
S3: a fusion module fuses the short depth data and the long depth data, so as to obtain at least one fused data of the target object.
In some embodiments, the depth fusion method further comprises, after the step S2, the following step:
S4: adjusting the short depth data.
In some embodiments, the step S4 further comprises the following steps:
S41: judging whether the short depth data meets the requirement; if it does not, executing step S42;
S42: starting an automatic exposure module to adjust the short exposure data; and
S43: adjusting the short depth data according to the short exposure data.
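The S41–S43 sequence amounts to a small control loop: check the short depth data, re-expose if needed, and recompute. The sketch below is a hypothetical Python illustration; the `depth_ok` check, the exposure gain factor, and the iteration cap are assumptions for the sketch, not taken from the patent:

```python
def adjust_short_depth(short_depth, short_exposure_us, depth_ok, recompute_depth,
                       gain=2.0, max_iters=5):
    """Sketch of steps S41-S43: while the short depth data fails the quality
    check (S41), let an auto-exposure step scale the short exposure (S42)
    and recompute the short depth data from the new exposure (S43)."""
    for _ in range(max_iters):
        if depth_ok(short_depth):            # S41: does the short depth meet the requirement?
            break
        short_exposure_us *= gain            # S42: automatic exposure adjusts the short exposure
        short_depth = recompute_depth(short_exposure_us)  # S43: new short depth data
    return short_depth, short_exposure_us
```

Bounding the loop with `max_iters` keeps the embedded pipeline from stalling on a scene that never converges, which seems consistent with the patent's emphasis on fast fusion.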
In some embodiments, the step S1 further comprises the following steps:
S11: a judgment module judges the type of the target object; when the target object is a far-scene object, step S12 is executed, and when the target object is a near-scene object, step S13 is executed;
S12: the exposure data module obtains at least one dual-frequency long exposure data and the short exposure data of the target object; and
S13: the exposure data module obtains at least one single-frequency long exposure data and the short exposure data of the target object.
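The S11–S13 branch can be pictured as a single classification step; the minimal Python sketch below is illustrative only (the distance-based classification and the 4 m threshold are assumptions — the patent's judgment module may classify far-scene and near-scene objects differently):

```python
def choose_long_exposure_mode(scene_distance_m: float, far_threshold_m: float = 4.0) -> str:
    """S11: classify the target as far-scene or near-scene. Far scenes get a
    dual-frequency long exposure (S12), which extends the unambiguous range;
    near scenes get a single-frequency long exposure (S13)."""
    if scene_distance_m > far_threshold_m:
        return "dual_frequency"    # S12
    return "single_frequency"      # S13
```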
In some embodiments, the step S2 further comprises the following steps:
S21: the computing module obtains the long exposure data and the short exposure data respectively;
S22: the computing module judges, according to the long exposure data and the short exposure data, whether a multi-core operation needs to be performed; when a multi-core operation needs to be performed, step S23 is executed; and
S23: the long exposure data and/or the short exposure data are split, and the depth data is obtained by the multi-core operation.
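Step S23 amounts to splitting the raw exposure data into bands and converting each band to depth on a separate core, then reassembling the result in order. A minimal sketch under stated assumptions: the per-pixel conversion is a placeholder (a real implementation would compute depth from multi-phase raw samples), and a thread pool stands in for the embedded multi-core dispatch:

```python
from concurrent.futures import ThreadPoolExecutor


def depth_from_raw(rows):
    """Placeholder per-pixel conversion from raw exposure samples to depth."""
    return [[v / 1000 for v in row] for row in rows]


def compute_depth_multicore(raw_rows, n_workers=4):
    """S23 sketch: split the exposure data into row bands, convert each band
    to depth on a separate worker, then reassemble the bands in order."""
    band_size = max(1, len(raw_rows) // n_workers)
    bands = [raw_rows[i:i + band_size] for i in range(0, len(raw_rows), band_size)]
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        parts = pool.map(depth_from_raw, bands)  # map preserves band order
    return [row for part in parts for row in part]
```

Because `Executor.map` yields results in submission order, the reassembled depth map keeps the original row layout regardless of which worker finishes first.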
In some embodiments, the step S3 further comprises the following steps:
S31: when the value of a coordinate pixel in the long depth data is greater than a threshold, the coordinate pixel is assigned the long depth data; and
S32: when the value of the coordinate pixel in the long depth data is less than or equal to the threshold, the coordinate pixel is assigned the short depth data.
In some embodiments, the threshold is implemented as the depth value at which a near object achieves the best depth quality under the long exposure mode.
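The per-pixel rule of S31–S32 can be sketched as follows (illustrative Python; the nested-list depth-map layout is an assumption for the sketch, not prescribed by the patent):

```python
def fuse_depth(long_depth, short_depth, threshold):
    """S31-S32: per coordinate pixel, keep the long-exposure depth value when
    it exceeds the threshold (a far pixel, well exposed under long exposure);
    otherwise fall back to the short-exposure depth value (a near pixel,
    where the long exposure tends to saturate)."""
    return [
        [l if l > threshold else s for l, s in zip(l_row, s_row)]
        for l_row, s_row in zip(long_depth, short_depth)
    ]
```

With the threshold set to the depth at which a near object is best rendered under long exposure, far pixels come from the long depth map and near pixels from the short one, yielding one fused map that covers both ranges.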
In some embodiments, the depth fusion method further comprises, before the step S1, the following step:
S0: setting the optical power parameter of at least one TOF light intensity sensor.
In some embodiments, the depth fusion method further comprises, after the step S3, the following step:
S5: transmitting the fused data to at least one host computer through a data interface.
According to another aspect of the present invention, the present invention further provides a TOF camera, wherein the TOF camera obtains the depth information of a target object according to the depth fusion method, wherein the depth fusion method comprises the following steps:
S1: an exposure data module obtains at least one long exposure data and at least one short exposure data of the target object;
S2: a computing module obtains at least one depth data of the target object, wherein the depth data comprises at least one short depth data and at least one long depth data; and
S3: a fusion module fuses the short depth data and the long depth data, so as to obtain at least one fused data of the target object.
Brief description of the drawings
Fig. 1 is a practical application diagram of the TOF imaging system according to an embodiment of the present invention, wherein the TOF imaging system is implemented as a TOF camera.
Fig. 2 is a structural schematic diagram of the TOF module of the TOF camera according to the above embodiment of the present invention.
Fig. 3 is a schematic diagram of the working principle of the TOF module according to the above embodiment of the present invention.
Fig. 4 is a structural schematic diagram of the TOF module according to the above embodiment of the present invention.
Fig. 5 is an exploded perspective view of the TOF module according to the above embodiment of the present invention.
Fig. 6 is a block diagram of the composition of the TOF imaging system according to the above embodiment of the present invention.
Fig. 7 is a schematic diagram, based on Fig. 6, of the composition of a data processing unit of the TOF imaging system according to the above embodiment of the present invention.
Fig. 8 is another schematic diagram of the composition of the data processing unit according to the above embodiment of the present invention.
Fig. 9 is a data flow schematic diagram of the depth fusion method based on the TOF imaging system according to the above embodiment of the present invention.
Figs. 10 to 15 are flow diagrams of the depth fusion method based on the TOF imaging system according to the above embodiments of the present invention.
Detailed description of the embodiments
The following description is provided to disclose the present invention so that those skilled in the art can implement it. The preferred embodiments in the following description are intended only as examples, and other obvious variations will occur to those skilled in the art. The basic principles of the invention defined in the following description can be applied to other embodiments, variants, improvements, equivalents, and other technical schemes that do not depart from the spirit and scope of the present invention.
Those skilled in the art will understand that, in the disclosure of the present invention, the orientations or positional relationships indicated by terms such as "longitudinal", "transverse", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", and "outer" are based on the orientations or positional relationships shown in the drawings, are merely for convenience and simplification of the description, and do not indicate or imply that the devices or elements referred to must have a particular orientation or be constructed and operated in a specific orientation; therefore, the above terms should not be construed as limiting the invention.
It should be understood that the term "a" or "one" is to be interpreted as "at least one" or "one or more"; that is, in one embodiment the quantity of an element may be one, while in another embodiment the quantity of that element may be plural, and the term "a" cannot be interpreted as a limitation on quantity.
As shown in Fig. 1, a TOF camera 100 is used to obtain the image of at least one target object 200. Unlike a traditional ordinary camera, the TOF camera 100 can obtain the three-dimensional image information of the target object 200 according to the time-of-flight method; in other words, the TOF camera 100 can obtain a 3D image of the target object 200. To simplify the description, the process by which the TOF camera 100 obtains the three-dimensional information of the target object 200 is described as follows: light from the target object 200 enters the TOF camera 100 and is received by a photosensitive element of the TOF camera 100; after the photosensitive element analyzes and processes the light, it converts the optical path information into the image information of the target object 200, thereby completing the imaging of the target object 200, and the photosensitive element of the TOF camera 100 can obtain the three-dimensional image of the target object 200 through the optical path phase difference.
However, when the existing TOF camera 100 obtains the depth information of the target object 200, problems exist to a greater or lesser extent, so that the TOF camera 100 cannot obtain the complete depth information of the target object 200. Specifically, the existing TOF camera 100 suffers from over-exposure or under-exposure when shooting the target object 200 at a far or near distance, and these exposure problems in turn greatly affect the image information of the target object 200. To solve the above problems, the present invention provides a depth fusion method based on a TOF imaging system; the depth fusion method can fuse images based on the depth information of the target object 200, so as to obtain the complete depth information of the target object 200.
It is worth noting that the present invention is described by taking a TOF imaging system implemented as a TOF camera 100 as an example, but those familiar with this technology should understand that the TOF imaging system is not limited to the TOF camera 100; the TOF imaging system may be implemented as any embedded device provided with a TOF module. For example, the TOF imaging system can be implemented as, but is not limited to, a TOF mobile phone, a TOF monitor, a TOF vision eye, and so on; the present invention is not restricted in this regard.
As shown in fig. 6, Fig. 6 is the TOF imaging system composition block diagram schematic diagram according to the abovementioned embodiments of the present invention. Wherein the TOF imaging system includes at least a TOF mould group 300 and at least one working cell 60, wherein the TOF mould group 300 are communicatively coupled to the working cell 60, also, the working cell 60 can control the TOF mould group 300, and institute The image data for stating the acquisition of TOF mould group 300 can transmit the working cell 60, to be handled by the working cell 60.Change speech It, the TOF mould group 300 is applied to obtain the image information of the target object 200, and the working cell 60 can control institute TOF mould group 300 is stated, and handles the described image information that the TOF mould group 300 obtains, to complete the TOF imaging system pair The imaging of target object 200.In an embodiment of the present invention, the working cell 60 controls the TOF mould group 300, so that institute State the complete depth information that TOF mould group 300 obtains the target object 200.
Specifically, the working unit 60 cooperates with the TOF module 300 so that the TOF imaging system can obtain the complete depth information of the target object 200. In other words, the depth fusion method provided by the invention can be carried out by the TOF module 300 and the working unit 60.
The present invention provides a depth fusion method based on a TOF imaging system, wherein the depth fusion method is implemented in the working unit 60, so that the TOF imaging system obtains the complete depth information of the target object 200 in the form of a depth fusion image. The working unit 60 processes the initial image data of the target object 200 obtained by the TOF module 300 and corrects the initial image data in the manner of depth fusion to obtain at least one depth fusion data, so that the TOF imaging system can obtain the complete depth information of the target object 200.
As shown in Fig. 2, the TOF module 300 includes at least one light source module 10 for providing laser light with a preset wavelength and at least one photosensitive module 20, wherein the photosensitive module 20 includes at least one TOF light intensity sensor 21. The light source module 10 can generate laser light with the preset wavelength and emit the laser toward the target object 200, and the TOF light intensity sensor 21 is arranged to receive the laser reflected by the target object 200 and to generate image information.
It is worth noting that the photosensitive module 20 is communicatively connected to the working unit 60, and the working unit 60 includes at least one data processing unit 61, wherein the data processing unit 61 is provided to receive the initial image data of the target object 200 generated by the TOF light intensity sensor 21; that is, the TOF light intensity sensor 21 is communicatively connected with the data processing unit 61.
Specifically, the light source module 10 and the photosensitive module 20 form a depth detection system to detect the surface depth of the target object 200, so as to obtain the depth information data of the target object 200. It can be understood that the laser emitted by the light source module 10 is reflected by the target object 200 and then sensed and detected by the TOF light intensity sensor 21. Therefore, every laser point data detected by the TOF light intensity sensor 21 carries depth (value) information, and the TOF light intensity sensor 21 is communicatively connected with the data processing unit 61 to obtain the initial image data of the target object 200. Those skilled in the art will appreciate that the laser emitted by the light source module 10 of the TOF module 300 may be infrared light. Preferably, the laser emitted by the light source module 10 is laser light with a preset wavelength.
In addition, as shown in Fig. 6, the working unit 60 further comprises a control unit 62, wherein the control unit 62 is arranged to control the operation of the TOF light intensity sensor 21 according to control instructions (such as control instructions from a host computer); that is, the TOF light intensity sensor 21 is controllably connected to the control unit 62. The control unit 62 can also control the operation of the TOF light intensity sensor 21 according to preset programs. Further, the control unit 62 is arranged to control the operation of the other structural modules of the working unit 60, for example controlling the data processing unit 61 to process in depth the initial image data obtained by the TOF light intensity sensor 21; that is, the data processing unit 61 is controllably connected to the control unit 62. In one embodiment of the invention, the control unit 62 is implemented as a reduced instruction set computer, hereinafter referred to as the RISC; the RISC controls the operation of the TOF module 300 and controls the operation of the other structural modules of the working unit 60.
It is worth noting that, in an embodiment of the present invention, the control unit 62 and the data processing unit 61 are communicatively connected to the TOF module 300, and the TOF imaging system of the invention can realize the processing of the image of the target object 200 in the manner of depth fusion. Specifically, the control unit 62 can control the light source module 10 to emit outward an emission beam with a preset wavelength; the emission beam forms at least one reflected beam after being reflected by the target object, and after the reflected beam is received by the photosensitive module 20, it is converted into the initial image data on the TOF light intensity sensor 21. The data processing unit 61 processes the initial image data to obtain at least one fused data R, and the fused data R ensures that the TOF imaging system obtains the complete depth information of the target object 200.
In an embodiment of the present invention, the TOF light intensity sensor 21 is implemented as an image sensor, which can be implemented as a CMOS or CCD sensor; the image sensor receives the reflected beam and generates the image information. In one embodiment of the invention, the data processing unit 61 is implemented as a digital signal processor to process various image data and various environmental data, and to obtain the fused data R of the target object 200 according to the depth fusion method. However, those familiar with this technology should understand that the concrete types of the TOF light intensity sensor 21, the control unit 62, and the data processing unit 61 are not limited by the present invention.
In addition, the working unit 60 further comprises a data interface 63, so that the fused data R obtained by the data processing unit 61 can be transmitted to a host computer, for example transferring the fused data R to the host computer through a MIPI data interface. The host computer can be implemented as all kinds of embedded devices, such as a computer, a robot, a mobile phone, and so on. It should be noted that the type of the data interface 63 is also unrestricted in the present invention and is selected according to the interface type of the host computer.
The working unit 60 further comprises a setting unit 64, which can be used to set all kinds of intrinsic parameters of the TOF module 300, and the setting unit 64 can also store parameter information. Specifically, the setting unit 64 is communicatively connected to the control unit 62, and the control unit 62 controls the setting unit 64 to set all kinds of parameters concerning the TOF module 300, such as the frame rate and exposure upper limit of the TOF module 300, or the threshold of the TOF module 300. Moreover, the setting unit 64 is communicatively connected to the data processing unit 61, and the data processing unit 61 consults or uses the parameters set in the setting unit 64 to obtain the fused data of the target object 200.
In conclusion the working cell 60 includes the data processing unit 61, described control unit 62, the data Interface 63 and the setting unit 64, wherein the data processing unit 61 is applied to handle the TOF light intensity sensor 21 original datas obtained, and the fused data R is obtained according to the depth integration method, wherein the control Unit 62 controls the data processing unit 61 and handles the original data, and the control TOF mould group 300 obtains institute State the process of the original data of target object 200, wherein the setting unit 64 be applied to be arranged it is a series of solid Have parameter so that the TOF mould group 300 is worked with the intrinsic parameter of setting, wherein the data-interface 63 transmit it is described just Beginning image data and the fused data R are to host computer described at least one, so that the host computer carries out subsequent processing. It is noted that the working cell 60 may also include other function unit module, the present invention is in this respect with no restrictions.
As shown in Fig. 2 to Fig. 5, the structure of the TOF module is introduced. The TOF module 300 further comprises a circuit board 30, wherein the light source module 10 and the photosensitive module 20 are communicatively connected to the circuit board 30; preferably, the light source module 10 and the photosensitive module 20 are arranged on the circuit board 30. That is, in one embodiment of the present invention, the light source module 10 and the photosensitive module 20 are integrally arranged on the circuit board 30, which on the one hand gives the TOF module 300 a compact structure and is conducive to the miniaturization of the TOF module 300, and on the other hand is conducive to improving the depth measurement precision of the TOF module 300. The working unit 60 is communicatively connected to the circuit board 30, so as to realize the communication connection with the TOF module 300. It is worth noting that the working unit 60 can control the light emission of the light source module 10 and control the imaging of the photosensitive module 20.
Certainly, the circuit board 30 includes but is not limited to a rigid circuit board, a flexible circuit board, a rigid-flex board, a ceramic board and a PCB board. In a preferred embodiment, the circuit board 30 is a PCB board having a light source module assembling area 31 and a photosensitive module assembling area 32, wherein the light source module assembling area 31 and the photosensitive module assembling area 32 are connected by a flexible connection plate 33, so that the light source module 10 and the photosensitive module 20 can move freely relative to each other, optimizing the overall structure of the TOF module 300. Particularly, in the present invention, the TOF module 300 adopts a stacked design, that is, the light source module 10 and the photosensitive module 20 are located at different heights, so that the size of the TOF module 300 is reduced and the positional tolerances between the components are relatively reduced as well. It should be understood that the light source module 10 is assembled in the light source module assembling area 31 of the circuit board 30, and the photosensitive module 20 is assembled in the photosensitive module assembling area 32 of the circuit board 30.
It is noted that for the ease of the heat dissipation of the even entire TOF mould group 300 of the light source module 10, the TOF The back portion region (opposite side with the 10 place face of light source module) of the wiring board 30 of mould group 300 is set cruelly Reveal in air, in order to radiate.Further, in one embodiment, it is set to the metallic conduction at 30 back side of wiring board For layer by partly exposed, the exposed region corresponds to the light source module 10, to further strengthen dissipating for the wiring board 30 Thermal effect.In another embodiment of the invention, the wiring board 30 further includes a heat-conducting plate 34, and the heat-conducting plate 34 is overlappingly It is set to the back side (opposite side with the 10 place face of light source module) of the wiring board 30, and can be conductively connected to The light source module 10 and the illuminant module 20, to reinforce the heat dissipation of the TOF mould group 300 by the heat-conducting plate 34 Performance.
The light source module 10 of the TOF imaging system further comprises a metal protective cover 14, wherein the metal protective cover 14 is arranged on the outside of the laser emitter 12 and serves as a part of a conducting circuit. In other words, when the protective cover 14 falls off from the outside of the laser emitter 12, the circuit of the light source module 10 that supplies power to the laser emitter 12 is disconnected, so that the light excitation or light emission of the laser emitter 12 of the light source module 10 is terminated. In addition, the protective cover 14 arranged on the outside of the laser emitter 12 serves as a housing of the laser emitter 12 and further provides a certain protective effect for the laser emitter 12.
The light source module 10 further comprises a diffractive optical element 15 (DOE), wherein the diffractive optical element 15 changes the phase and spatial intensity of the light wave produced by the laser emitter 12. Those skilled in the art will appreciate that the modulated emitted laser not only has higher resistance to environmental interference, which is conducive to improving the measurement precision of the TOF module 300, but also that the modulated emitted light wave will not damage human eyes.
Particularly, the protective cover 14 is installed on the circuit board 30, so that a sealed cavity is formed between the circuit board 30 and the metal protective cover 14, wherein the laser emitter 12 and the diffractive optical element 15 are accommodated in the sealed cavity, and the exit direction of the laser is controlled by an optical window 142 arranged at the top of the protective cover 14. The sealed cavity cooperates with the optical window 142: on the one hand, the laser emitter 12 is isolated, preventing radiation pollution; on the other hand, the laser produced by the laser emitter 12 can only exit to the outside through the optical window 142, thereby restrictively limiting the exit direction of the laser.
As shown in Fig. 6, the light source module 10 of the TOF module 300 further comprises a driving circuit 16, wherein the driving circuit 16 is arranged between the power supply and the laser emitter 12 to control the power supplied to the laser emitter 12. Preferably, the driving circuit 16 can be connected to the control unit 62, so that the driving circuit 16 can control the power supplied to the laser emitter 12 according to the control instructions of the control unit 62.
In addition, the photosensitive module 20 further comprises a camera lens 23, wherein the camera lens 23 includes at least one lens, wherein the camera lens 23 is arranged outside the TOF light intensity sensor 21 of the photosensitive module 20 and corresponds to the photosensitive path of the TOF light intensity sensor 21, so that the camera lens 23 collects the beams reflected from the surface of the measured target.
The photosensitive module 20 further comprises a holder 24, wherein the holder 24 is arranged to hold the camera lens 23 in an appropriate position. Preferably, the camera lens 23 is arranged in a position-fixing hole 240 formed by the holder 24, so as to ensure that the camera lens 23 is in a predetermined position.
The photosensitive module 20 further includes a filter element 25, wherein the filter element 25 is arranged between the TOF light intensity sensor 21 and the camera lens 23, so that stray light is filtered by the filter element 25, improving the measurement precision of the TOF module 300.
Certainly, the TOF imaging system further comprises a bracket 50, wherein the circuit board 30 is arranged on the bracket 50, so that the position of the circuit board 30 is fixed. Further, the positions of the electronic components arranged on the circuit board 30 are thereby also fixed, so as to realize the preset layout of the TOF module 300.
In an embodiment of the present invention, the TOF module 300 obtains the raw image data of the target object 200 by the time-of-flight method, and the data processing unit 61 processes the raw image data to obtain the fused data R of the target object 200, so that the TOF imaging system finally obtains the complete depth information of the target object 200.
As shown in Fig. 7 to Fig. 9, the present invention provides a depth fusion method based on a TOF imaging system, which can be realized by a depth fusion algorithm arranged in the working unit 60. During the imaging process of the TOF imaging system, the TOF imaging system fuses the depth map obtained under long exposure with the depth map obtained under short exposure to obtain the fused data R, wherein the fused data R can reflect the complete depth information of the target object 200.
Specifically, the data processing unit 61 is communicatively connected to the TOF module 300 to process the raw image data, obtain the depth data under different exposure times, and fuse the depth data to obtain the fused data R, so as to obtain the complete depth information of the target object 200. As shown in Fig. 6, the data processing unit 61 includes a judging module 611, wherein the judging module 611 is communicatively connected to the TOF light intensity sensor 21 and judges the type of the currently acquired target object 200, the basis of the judgment being the distance of the target object 200 from the TOF module 300.
Specifically, the TOF module 300 photographs the at least one target object 200, the TOF light intensity sensor 21 obtains the raw image data of the target object 200, and the judging module 611 is communicatively connected to the TOF light intensity sensor 21, so that the judging module 611 obtains the raw image data. The judging module 611 judges, according to the raw image data, whether the target object 200 is a far-scene object or a near-scene object; that is, before the data processing unit 61 performs the depth fusion algorithm, it first judges whether the type of the target object 200 is a far-scene type or a near-scene type.
It is worth noting that the judging module 611 can judge the type of the target object 200 according to the distance of the optical path over which the raw image data is obtained. A criterion distance can be set in the judging module 611: when the judging module 611 judges that the optical path distance is greater than the criterion distance, the current target object 200 is judged to be a far-scene object; when the judging module 611 judges that the optical path distance is less than the criterion distance, the current target object 200 is judged to be a near-scene object. Those skilled in the art should understand that the judgment method of the judging module 611 is not limited to the one mentioned in this embodiment; the judgment criterion of the judging module 611 can also be set manually, and the present invention is not restricted in this regard.
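The distance-based judgment described above can be sketched as follows. The function name, the units, and the default criterion distance are illustrative assumptions only, since the patent leaves the criterion configurable.

```python
# Illustrative sketch of the scene-type judgment performed by the
# judging module 611: compare the measured optical-path distance
# against a configurable criterion distance.

FAR_SCENE = "far"
NEAR_SCENE = "near"

def classify_scene(optical_path_distance_m, criterion_distance_m=1.5):
    """Return FAR_SCENE when the target lies beyond the criterion
    distance, otherwise NEAR_SCENE. The 1.5 m default is an assumption
    for illustration, not a value taken from the patent."""
    if optical_path_distance_m > criterion_distance_m:
        return FAR_SCENE
    return NEAR_SCENE
```

A manually tuned criterion can be passed per call, mirroring the statement that the criterion may also be set by hand.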
The data processing unit 61 further comprises an exposure data module 612, which is communicatively connected to the judging module 611 and acquires different types of exposure data according to the different types of targets. The exposure data module 612 is communicatively connected to the TOF module 300 so as to obtain the different exposure data. It is also worth noting that the exposure data module 612 includes a long exposure data submodule 6121 and a short exposure data submodule 6122, which respectively obtain the long exposure data L and the short exposure data D of the target object 200.
It is worth noting that the exposure data module 612 obtains different exposure data for different types of target objects 200. Specifically, when the judging module 611 judges the current target object 200 to be a far-scene object, the exposure data module 612 obtains the exposure data of the target object 200 in the form of a distant view; at this time, the long exposure data submodule 6121 obtains the dual-frequency long exposure data L1 of the target object 200. Correspondingly, when the judging module 611 judges the current target object 200 to be a near-scene object, the exposure data module 612 obtains the exposure data of the target object 200 in the form of a close view; at this time, the long exposure data submodule 6121 obtains the single-frequency long exposure data L2 of the target object 200.
It should be noted here that dual-frequency refers to modulating and demodulating near-infrared light of two frequencies, while single-frequency refers to modulating and demodulating near-infrared light of a single frequency; a major feature of dual-frequency near-infrared light is its long measuring distance. Therefore, when the target object 200 is a far-scene object, the long exposure data submodule 6121 obtains the dual-frequency long exposure data L1. Specifically, the setting unit 64 sets the exposure time of the TOF module 300 and the frequency of the near-infrared light, so that the TOF module 300 can obtain the exposure data of the target object 200 with single-frequency near-infrared light and/or dual-frequency near-infrared light. The data of the target object 200 obtained by the TOF module 300 with the dual-frequency near-infrared light is implemented as the dual-frequency long exposure data L1, and the data of the target object 200 obtained by the TOF module 300 with the single-frequency near-infrared light is implemented as the single-frequency long exposure data L2. It is worth noting that the long exposure data L includes the current long exposure time and the current raw image data.
In addition, the short exposure data submodule 6122 is also arranged to collect at least one short exposure data D of the target object 200. In an embodiment of the present invention, the short exposure data D is the data of the target object 200 acquired when the TOF module 300 performs a short exposure with the single-frequency near-infrared light. The short exposure data D includes the current short exposure time and the current raw image data. In other words, the exposure data module 612 obtains the long exposure data L and the short exposure data D of the current target object 200.
The data processing unit 61 includes a computing module 613, which is communicatively connected to the exposure data module 612 to obtain the exposure data, and obtains at least one depth data S of the target object 200 according to the exposure data.
It is worth noting that the computing module 613 obtains the long exposure data L and the short exposure data D respectively, and generates a long depth data S2 and a short depth data S1 respectively. That is, a TOF algorithm is arranged in the computing module 613 to convert the long exposure data L and the short exposure data D separately into the depth data S. The depth data S obtained by the computing module 613 according to the long exposure data L is implemented as the long depth data S2; analogously, the depth data S obtained by the computing module 613 according to the short exposure data D is implemented as the short depth data S1.
It should be noted here that, due to the differences between embedded and personal computer platforms, the computing module 613 needs to make a multicore operation judgment when calculating the depth data S. Specifically, the computing module 613 judges whether the calculation of the depth data S meets the real-time requirement of the calculation. When the computing module 613 judges that the calculation of the depth data S does not meet the real-time requirement, the computing module 613 divides the raw image data and calculates the depth data S in a multicore manner.
In other words, when the long exposure data L or the short exposure data D shows that the calculation of the current depth data S does not meet the real-time requirement, the computing module 613 divides the long exposure data L or the short exposure data D so as to carry out the calculation of the depth data S simultaneously, thereby improving the computational efficiency of the depth data S.
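As a rough illustration of this splitting scheme, the sketch below divides a raw frame into strips and converts each strip concurrently. The per-pixel conversion is a stand-in (the actual TOF phase demodulation is not given in the text), and a thread pool is used here for simplicity where an embedded implementation would dispatch strips to physical cores.

```python
from concurrent.futures import ThreadPoolExecutor

def depth_from_raw(strip):
    # Stand-in conversion; real code would demodulate TOF phase data.
    return [v * 0.5 for v in strip]

def compute_depth_multicore(raw, cores=4):
    # Divide the exposure data into one strip per core ...
    n = max(1, (len(raw) + cores - 1) // cores)
    strips = [raw[i:i + n] for i in range(0, len(raw), n)]
    # ... convert the strips simultaneously ...
    with ThreadPoolExecutor(max_workers=cores) as pool:
        parts = pool.map(depth_from_raw, strips)
    # ... and reassemble the depth data in its original order.
    return [d for part in parts for d in part]
```

Because the strips are independent, the reassembled result equals what a single-core pass would produce; only the wall-clock time changes.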
In addition, it is worth noting that, due to the limited fast cache resources of some embedded platforms, the raw image data obtained by the TOF light intensity sensor 21 is stored in at least one slow cache, the slow cache being implemented as a physical structure such as DDR memory. That is, the long exposure data L and the short exposure data D are stored in the slow cache, and during the calculation of the depth data S, the computing module 613 extracts the raw image data stored in the slow cache into at least one fast cache by means of direct memory access, calculates and obtains the depth data S in the fast cache, and the depth data S is then moved back from the fast cache into the slow cache to be stored. The fast cache is implemented as the high-speed memory of the chip (a physical structure such as a cache).
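Under the stated assumption that only a small fast buffer is available, the staging scheme can be modelled as below. The DMA transfers are reduced to plain slice copies, and the buffer size and names are illustrative assumptions, not values from the patent.

```python
# Sketch of the slow-cache / fast-cache staging: the raw frame lives in
# slow memory (e.g. DDR), the computation runs tile by tile in a small
# fast buffer (on-chip cache), and results are written back.

FAST_BUFFER_SIZE = 1024  # elements that fit in the fast cache (assumed)

def staged_compute(slow_raw, convert, tile=FAST_BUFFER_SIZE):
    slow_depth = [0.0] * len(slow_raw)
    for start in range(0, len(slow_raw), tile):
        fast = slow_raw[start:start + tile]            # DMA: slow -> fast
        result = [convert(v) for v in fast]            # compute in fast cache
        slow_depth[start:start + len(result)] = result # DMA: fast -> slow
    return slow_depth
```

The tile loop is what keeps the working set inside the fast cache regardless of frame size, which is the point of the scheme described above.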
In other words, the computing module 613 calculates and obtains the long depth data S2 according to the long exposure data L, and calculates and obtains the short depth data S1 according to the short exposure data D, wherein the long depth data S2 represents the depth information of the target object 200 obtained under the long exposure time, and the short depth data S1 represents the depth information of the target object 200 obtained under the short exposure time.
In addition, in order to adjust the depth data S, the data processing unit 61 further comprises an adjustment module 614, which is applied to adjust the depth data S; specifically, the adjustment module 614 is applied to adjust the short depth data S1. It is worth noting that the adjustment module 614 further comprises at least one first adjustment module 6141 and at least one second adjustment module 6142, wherein the short depth data S1 is stored to the first adjustment module 6141 and the long depth data S2 is stored to the second adjustment module 6142.
When the depth data S is implemented as the short depth data S1, the first adjustment module 6141 further judges whether to carry out the calculation of the automatic exposure algorithm; if the depth data is the long depth data S2, the second adjustment module 6142 need not make such a judgment. It is worth noting that the automatic exposure algorithm refers to adjusting the exposure time under the present mode (dual-frequency or single-frequency), so that the TOF module 300 does not over-expose, thereby improving the completeness of the fused data R of the target object 200 and avoiding the loss of information in the long exposure area.
At this time, the first adjustment module 6141 is communicatively connected to an automatic exposure module 616, and the automatic exposure module 616 calculates the automatic exposure time with the automatic exposure algorithm. It is worth noting that the first adjustment module 6141 judges whether the current short depth data S1 meets the requirements; when the short depth data S1 does not meet the requirements, the automatic exposure module 616 obtains the automatic exposure time, so as to change the current short exposure time. When the short exposure time is replaced by the automatic exposure time, the short exposure data D is correspondingly replaced, and the short depth data S1 is adjusted. Certainly, not all short depth data S1 require automatic exposure adjustment; the automatic exposure adjustment is intended to make the depth information of the target object 200 more complete.
In addition, the data processing unit 61 further comprises a fusion module 615, which fuses the long depth data S2 and the short depth data S1 to obtain the fused data R, so as to obtain the complete depth information of the target object 200. It is worth noting that the fusion module 615 is communicatively connected to the setting unit 64, and the setting unit 64 sets and stores the threshold Y of the current TOF module 300, the threshold Y being implemented as the depth value at which a closer object is at the optimum depth effect under the long exposure mode. The fusion module 615 obtains the threshold Y and obtains the fused data R based on the threshold Y. It is worth noting that the threshold Y can be adjusted: it can be determined by the actual depth effect of the target object 200, and its setting is related to the material of the target object 200, so the threshold Y can be trimmed so that the fused data R reaches a better quality.
Specifically, when the fusion module 615 judges that the value corresponding to a coordinate pixel of the long depth data S2 is greater than the threshold Y, the coordinate pixel is assigned the long depth data S2; when the value corresponding to a coordinate pixel of the long depth data S2 is less than or equal to the threshold Y, the coordinate pixel is implemented as the short depth data S1, so as to obtain the fused data R. The short depth data S1 and the long depth data S2 are fused in a complementary fashion, so as to obtain the fused data R that can reflect the complete depth information of the target object 200.
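The per-pixel rule just described amounts to the following sketch; the flat-list layout is an illustrative assumption, since the patent does not prescribe a data structure for the depth maps.

```python
# Per-pixel depth fusion: keep the long-exposure depth where it exceeds
# the threshold Y, otherwise fill in with the short-exposure depth.

def fuse_depth(long_depth, short_depth, threshold_y):
    return [l if l > threshold_y else s
            for l, s in zip(long_depth, short_depth)]
```

The two maps complement each other: far pixels survive the long exposure, while near pixels that saturate under long exposure are recovered from the short one.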
It is worth noting that the working unit 60 includes the data interface 63, which is communicatively connected to the data processing unit 61 and sends the fused data R to at least one host computer, so that the host computer can carry out the next processing step using the fused data R.
In addition, the setting unit 64 can be applied to set a series of working parameters of the TOF module 300; for example, the working parameters include the frame rate and the exposure upper limit of the TOF module 300, and specifically, the setting of the frame rate and the exposure upper limit of the TOF module 300 provides laser safety protection for the laser of the light source module 10. The working parameters also include the setting of the threshold Y, so that the fusion module 615 can carry out the fusion of the depth maps according to the threshold Y. Those skilled in the art should understand that the present invention is not restricted in this regard.
According to another aspect of the present invention, the present invention provides a depth fusion method based on a TOF imaging system, wherein the TOF imaging system is applied to obtain the depth information of at least one target object 200, comprising the following steps:

S1: an exposure data module 612 obtains at least one long exposure data L and at least one short exposure data D of the target object 200;

S2: a computing module 613 obtains at least one depth data S of the target object 200, wherein the depth data S includes at least one short depth data S1 and at least one long depth data S2; and

S3: a fusion module 615 fuses the short depth data S1 and the long depth data S2, so as to obtain at least one fused data R of the target object 200.
The depth fusion method based on the TOF imaging system fuses the short depth data S1 and the long depth data S2 to obtain the fused data R of the target object 200, so that by this method the TOF imaging system can obtain the complete depth information of the target object 200. Specifically, the TOF imaging system includes a TOF module 300 and a working unit 60; the TOF module 300 obtains multiple groups of the exposure data of the target object 200, and it is worth mentioning that the exposure data includes an exposure time and the raw image data under the current exposure time. The working unit 60 processes and fuses the exposure data to obtain the fused data R, so that the TOF imaging system obtains the complete depth information of the target object 200.
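A condensed sketch of steps S1 to S3 is given below, with each module reduced to a placeholder so the overall control flow is visible. Every function body here is an illustrative stand-in, not the actual algorithm of the respective module.

```python
def acquire_exposures(target):           # S1: exposure data module 612
    return target["long_raw"], target["short_raw"]

def to_depth(raw):                       # S2: computing module 613
    return [v * 0.5 for v in raw]        # stand-in TOF conversion

def fuse(long_d, short_d, y):            # S3: fusion module 615
    return [l if l > y else s for l, s in zip(long_d, short_d)]

def depth_fusion(target, threshold_y):
    long_raw, short_raw = acquire_exposures(target)
    return fuse(to_depth(long_raw), to_depth(short_raw), threshold_y)
```

Each stage maps one-to-one onto a module of the working unit 60, which is why the method can be realized as a single algorithm arranged in that unit.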
The step S1 further includes the following steps:

S11: a judging module 611 judges the type of the target object 200, executing step S12 when the target object 200 is a far-scene object and executing step S13 when the target object 200 is a near-scene object;

S12: the exposure data module 612 obtains at least one dual-frequency long exposure data L1 and the short exposure data D of the target object 200; and

S13: the exposure data module 612 obtains at least one single-frequency long exposure data L2 and the short exposure data D of the target object 200.
It is worth noting that the judging module 611 can judge the type of the target object 200 according to the distance of the optical path over which the raw image data is obtained. A criterion distance can be set in the judging module 611: when the judging module 611 judges that the optical path distance is greater than the criterion distance, the current target object 200 is judged to be a far-scene object; when the judging module 611 judges that the optical path distance is less than the criterion distance, the current target object 200 is judged to be a near-scene object. Those skilled in the art should understand that the judgment method of the judging module 611 is not limited to the one mentioned in this embodiment; the judgment criterion of the judging module 611 can also be set manually, and the present invention is not restricted in this regard.
It should be noted here that dual-frequency refers to modulating and demodulating near-infrared light of two frequencies, while single-frequency refers to modulating and demodulating near-infrared light of a single frequency; a major feature of dual-frequency near-infrared light is its long measuring distance. The data of the target object 200 obtained by the TOF module 300 with the dual-frequency near-infrared light is implemented as the dual-frequency long exposure data L1, and the data of the target object 200 obtained by the TOF module 300 with the single-frequency near-infrared light is implemented as the single-frequency long exposure data L2. The short exposure data D is the data of the target object 200 acquired when the TOF module 300 performs a short exposure with the single-frequency near-infrared light.
In the step S2, the computing module 613 needs to make a multicore operation judgment before calculating the depth data S; specifically, the step S2 further includes the following steps:

S21: the computing module 613 obtains the long exposure data L and the short exposure data D respectively;

S22: the computing module 613 judges, according to the long exposure data L and the short exposure data D, whether multicore operation needs to be carried out, and executes step S23 when multicore operation needs to be performed; and

S23: dividing the long exposure data L and/or the short exposure data D, and obtaining the depth data S by multicore operation.
It is worth noting that, in the step S23, the long exposure data L and the short exposure data D are stored in at least one slow cache, the slow cache being implemented as a physical structure such as DDR memory. That is, the long exposure data L and the short exposure data D are stored in the slow cache, and during the calculation of the depth data S, the computing module 613 extracts the long exposure data L and/or the short exposure data D stored in the slow cache into at least one fast cache by means of direct memory access, calculates and obtains the depth data S in the fast cache, and the depth data S is then moved back from the fast cache into the slow cache to be stored. The fast cache is implemented as the high-speed memory of the chip (a physical structure such as a cache). In this way, the efficiency of calculating the depth data S is improved.
It is worth noting that the computing module 613 obtains the long exposure data L and the short exposure data D respectively, and generates the long depth data S2 and the short depth data S1 respectively. That is, a TOF algorithm is arranged in the computing module 613 to convert the long exposure data L and the short exposure data D separately into the depth data S. The depth data S obtained by the computing module 613 according to the long exposure data L is implemented as the long depth data S2; analogously, the depth data S obtained by the computing module 613 according to the short exposure data D is implemented as the short depth data S1.
The depth fusion method based on the TOF imaging system further includes, after the step S2, the following step:

S4: adjusting the short depth data S1.
The working unit comprises the adjustment module 614, which is communicatively connected to at least one automatic exposure module 616; the automatic exposure module 616 calculates the automatic exposure time with the automatic exposure algorithm. It is worth noting that the adjustment module 614 judges whether the current short depth data S1 meets the requirements; when the short depth data S1 does not meet the requirements, the automatic exposure module 616 obtains the automatic exposure time, so as to change the current short exposure time. When the short exposure time is replaced by the automatic exposure time, the short exposure data D is correspondingly replaced, and the short depth data S1 is also adjusted. Certainly, not all short depth data S1 require automatic exposure adjustment; the automatic exposure adjustment is intended to make the depth information of the target object 200 more complete.
The step S4 further includes the following steps:

S41: judging whether the short depth data S1 meets the requirements, and executing step S42 if it does not meet the standard;

S42: starting an automatic exposure module 616 to adjust the short exposure data D; and

S43: adjusting the short depth data S1 according to the short exposure data D.
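Steps S41 to S43 can be sketched as a single adjustment pass. The quality check, the auto-exposure update, and the recomputation are supplied as callables, since the patent sets the standard according to actual conditions and does not fix these rules.

```python
# Sketch of the S41-S43 adjustment loop: when the short depth data
# fails its quality check, an auto-exposure step proposes a new short
# exposure time and the short depth data is recomputed from it.

def adjust_short_depth(short_depth, short_exposure_ms, quality_ok,
                       auto_exposure, recompute):
    if quality_ok(short_depth):                       # S41: acceptable as-is
        return short_depth, short_exposure_ms
    new_exposure = auto_exposure(short_exposure_ms)   # S42: new exposure time
    return recompute(new_exposure), new_exposure      # S43: recomputed depth
```

Passing the rules in as parameters mirrors the text's point that the standard for S1 and the exposure update are tuned per application rather than fixed.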
It is worth noting that the standard which the short depth data S1 should meet is set according to actual conditions. The automatic exposure module 616 obtains the automatic exposure time with the automatic exposure algorithm, and the automatic exposure algorithm refers to adjusting the exposure time under the present mode (dual-frequency or single-frequency), so that the exposure time of the TOF module 300 is not too long, thereby improving the completeness of the fused data R of the target object 200 and avoiding the loss of information in the long exposure area.
The fusion module 615 fuses the long depth data S2 and the short depth data S1 to obtain the fused data R, so as to obtain the complete depth information of the target object 200. It is worth noting that the fusion module 615 is communicatively connected to a setting unit 64, which stores the threshold Y of the current TOF module 300, the threshold Y being implemented as the depth value at which a closer object is at the optimum depth effect under the long exposure mode.
Specifically, when the fusion module 615 judges that the value corresponding to a coordinate pixel of the long depth data S2 is greater than the threshold Y, the coordinate pixel is assigned the long depth data S2; when the value corresponding to a coordinate pixel of the long depth data S2 is less than or equal to the threshold Y, the coordinate pixel is implemented as the short depth data S1, so as to obtain the fused data R. The short depth data S1 and the long depth data S2 are fused in a complementary fashion, so as to obtain the fused data R that can reflect the complete depth information of the target object 200.
The step S3 further includes the following steps:

S31: when the value of a coordinate pixel of the long depth data S2 is greater than the threshold value Y, the coordinate pixel is assigned the long depth data S2; and

S32: when the value of the coordinate pixel of the long depth data S2 is less than or equal to the threshold value Y, the coordinate pixel is assigned the short depth data S1.
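Steps S31 and S32 amount to a per-pixel selection between the two depth maps. A minimal NumPy sketch (the array layout, units, and the example value of Y are assumptions for illustration, not values disclosed in the patent):

```python
import numpy as np

def fuse_depth(long_depth, short_depth, threshold_y):
    """Per-pixel fusion of long- and short-exposure depth maps (steps S31/S32).

    Where the long-exposure depth exceeds the threshold Y, the scene point is
    far enough for the long exposure to be reliable, so its value is kept;
    otherwise the short-exposure value is used, since near objects saturate
    under long exposure.
    """
    long_depth = np.asarray(long_depth, dtype=np.float32)
    short_depth = np.asarray(short_depth, dtype=np.float32)
    return np.where(long_depth > threshold_y, long_depth, short_depth)
```

For example, with Y = 0.5 m, a pixel whose long-exposure depth reads 0.3 m (a near, likely saturated point) takes its short-exposure value, while a pixel reading 2.0 m keeps the long-exposure value.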
In addition, it is noted that the threshold value Y can be reset according to the actual depth effect; the setting of the threshold value Y depends strongly on the practical application scene and on the material of the target object 200.
In addition, the depth fusion method based on the TOF imaging system further includes, before the step S1, the following step:

S0: setting the optical power parameter of at least one TOF light intensity sensor 21.
Specifically, the step S0 includes the following: setting the working parameters of the TOF module 300 through a setting unit 64. For example, the working parameters include the frame rate and the exposure upper limit of the TOF module 300; the frame rate and the exposure upper limit are set in order to provide laser-safety protection for the laser of the light source module 10. The working parameters also include the setting of the threshold value Y, so that the fusion module 615 can perform the fusion of the depth maps according to the threshold value Y. Those skilled in the art should understand that the present invention is not restricted in this regard.
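The working parameters set in step S0 can be grouped into one configuration record. The sketch below is hypothetical — the field names, units, and the validation rule are assumptions for illustration, not details from the patent:

```python
from dataclasses import dataclass

@dataclass
class TofWorkingParameters:
    """Working parameters set through the setting unit before capture.

    frame_rate_hz and exposure_upper_limit_us bound the duty cycle of the
    light source (laser-safety protection); threshold_y_m is the depth
    threshold used by the fusion module when merging the two depth maps.
    """
    frame_rate_hz: float
    exposure_upper_limit_us: float
    threshold_y_m: float

    def validate(self) -> bool:
        # All parameters must be positive, and the exposure upper limit
        # must fit within one frame period.
        if self.frame_rate_hz <= 0:
            return False
        frame_period_us = 1e6 / self.frame_rate_hz
        return (0 < self.exposure_upper_limit_us < frame_period_us
                and self.threshold_y_m > 0)
```

At an assumed 30 fps the frame period is about 33.3 ms, so an exposure cap of, say, 2 ms passes validation while one longer than the frame period does not.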
After the step S3, the depth fusion method based on the TOF imaging system further includes the following step:

S5: transmitting the fused data R to at least one host computer through a data interface 63.
The data interface 63 is communicatively coupled to the data processing unit 61 and sends the fused data R to at least one host computer, so that the host computer can perform further processing using the fused data R. For example, the fused data R is transferred to the host computer through a MIPI data interface. The host computer can be implemented as various kinds of embedded devices, such as a computer, a robot, a mobile phone, and the like.
In addition, it should be understood by those skilled in the art that the foregoing description and the embodiments of the present invention shown in the drawings serve only as examples and are not intended to limit the present invention. The objectives of the present invention have been fully and effectively achieved. The functional and structural principles of the present invention have been shown in the embodiments and the description, and the embodiments of the present invention may be subject to any variation or modification without departing from said principles.

Claims (20)

1. A depth fusion method based on a TOF imaging system, wherein the TOF imaging system is applied to obtain the depth information of at least one target object, characterized by comprising the following steps:

S1: an exposure data module obtains at least one long exposure data and at least one short exposure data of the target object;

S2: a computing module obtains at least one depth data of the target object, wherein the depth data includes at least one short depth data and at least one long depth data; and

S3: a fusion module merges the short depth data and the long depth data, so as to obtain at least one fused data of the target object.
2. The depth fusion method based on a TOF imaging system according to claim 1, wherein the method further includes, after the step S2, the following step:

S4: adjusting the short depth data.
3. The depth fusion method based on a TOF imaging system according to claim 2, wherein the step S4 further includes the following steps:

S41: judging whether the short depth data meets the requirements, and if it does not, executing step S42;

S42: starting an automatic exposure module to adjust the short exposure data; and

S43: adjusting the short depth data according to the short exposure data.
4. The depth fusion method based on a TOF imaging system according to claim 1 or claim 3, wherein the step S1 further includes the following steps:

S11: a judgment module judges the type of the target object; when the target object is a far-scene object, step S12 is executed, and when the target object is a near-scene object, step S13 is executed;

S12: the exposure data module obtains at least one dual-frequency long exposure data and the short exposure data of the target object; and

S13: the exposure data module obtains at least one single-frequency long exposure data and the short exposure data of the target object.
5. The depth fusion method based on a TOF imaging system according to claim 4, wherein the step S2 further includes the following steps:

S21: the computing module obtains the long exposure data and the short exposure data respectively;

S22: the computing module judges, according to the long exposure data and the short exposure data, whether multi-core operation needs to be performed, and when multi-core operation needs to be performed, step S23 is executed; and

S23: dividing the long exposure data and/or the short exposure data, and obtaining the depth data by multi-core operation.
6. The depth fusion method based on a TOF imaging system according to claim 5, wherein the step S3 further includes the following steps:

S31: when the value of a coordinate pixel of the long depth data is greater than a threshold value, the coordinate pixel is assigned the long depth data; and

S32: when the value of the coordinate pixel of the long depth data is less than or equal to the threshold value, the coordinate pixel is assigned the short depth data.
7. The depth fusion method based on a TOF imaging system according to claim 6, wherein the threshold value is implemented as the depth value at which a near object attains the optimum depth effect under the long exposure mode.
8. The depth fusion method based on a TOF imaging system according to claim 4, wherein the dual-frequency long exposure data is implemented as the data of the target object obtained by the TOF module with at least one dual-frequency near-infrared light, wherein the dual-frequency near-infrared light is implemented as the near-infrared light obtained after demodulation of two kinds of frequency modulation.

9. The depth fusion method based on a TOF imaging system according to claim 8, wherein the single-frequency long exposure data is implemented as the data of the target object obtained by the TOF module with at least one single-frequency near-infrared light.

10. The depth fusion method based on a TOF imaging system according to claim 9, wherein the long exposure data includes at least one long exposure time and the raw data corresponding to the long exposure time, and the short exposure data includes at least one short exposure time and the raw data corresponding to the short exposure time.
11. The depth fusion method based on a TOF imaging system according to claim 5, wherein in the step S23, the long exposure data and the short exposure data are stored in at least one slow cache, and the computing module extracts, by means of direct memory access, the long exposure data and/or the short exposure data stored in the slow cache into at least one fast cache, and calculates in the fast cache to obtain the depth data.

12. The depth fusion method based on a TOF imaging system according to claim 11, wherein the slow cache is implemented as at least one double data rate synchronous dynamic random access memory, and the fast cache is implemented as at least one cache memory.

13. The depth fusion method based on a TOF imaging system according to claim 10, wherein in the step S23, the long exposure data and the short exposure data are stored in at least one slow cache, and the computing module extracts, by means of direct memory access, the long exposure data and/or the short exposure data stored in the slow cache into at least one fast cache, and calculates in the fast cache to obtain the depth data.

14. The depth fusion method based on a TOF imaging system according to claim 13, wherein the slow cache is implemented as at least one double data rate synchronous dynamic random access memory, and the fast cache is implemented as at least one cache memory.
15. The depth fusion method based on a TOF imaging system according to any one of claims 1 to 3, wherein the depth fusion method further includes, before the step S1, the following step:

S0: setting the optical power parameter of at least one TOF light intensity sensor.

16. The depth fusion method based on a TOF imaging system according to claim 1 or claim 3, wherein the depth fusion method further includes, after the step S3, the following step:

S5: transmitting the fused data to at least one host computer through a data interface.
17. The depth fusion method based on a TOF imaging system according to claim 15, wherein the TOF imaging system includes a setting unit, and the setting unit sets at least one frame rate and at least one exposure upper limit of the TOF light intensity sensor, the frame rate and the exposure upper limit forming the optical power parameter.

18. The depth fusion method based on a TOF imaging system according to claim 16, wherein the host computer is implemented as various kinds of embedded devices, selected from one of a computer, a robot, or a mobile phone, or a combination thereof.

19. The depth fusion method based on a TOF imaging system according to claim 6, wherein the threshold value can be adjusted according to the fused data.

20. A TOF camera, characterized in that the TOF camera obtains the depth information of a target object with the depth fusion method according to any one of claims 1 to 19.
CN201711170732.5A 2017-11-22 2017-11-22 Depth fusion method based on TOF imaging system and TOF camera Active CN109819173B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711170732.5A CN109819173B (en) 2017-11-22 2017-11-22 Depth fusion method based on TOF imaging system and TOF camera

Publications (2)

Publication Number Publication Date
CN109819173A true CN109819173A (en) 2019-05-28
CN109819173B CN109819173B (en) 2021-12-03

Family

ID=66601169

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711170732.5A Active CN109819173B (en) 2017-11-22 2017-11-22 Depth fusion method based on TOF imaging system and TOF camera

Country Status (1)

Country Link
CN (1) CN109819173B (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110708518A (en) * 2019-11-05 2020-01-17 北京深测科技有限公司 People flow analysis early warning dispersion method and system
CN110794422A (en) * 2019-10-08 2020-02-14 歌尔股份有限公司 Robot data acquisition system and method with TOF imaging module
CN111246120A (en) * 2020-01-20 2020-06-05 珊口(深圳)智能科技有限公司 Image data processing method, control system and storage medium for mobile device
CN112073646A (en) * 2020-09-14 2020-12-11 哈工大机器人(合肥)国际创新研究院 Method and system for TOF camera long and short exposure fusion
CN112711324A (en) * 2019-10-24 2021-04-27 浙江舜宇智能光学技术有限公司 Gesture interaction method and system based on TOF camera
WO2021093502A1 (en) * 2019-11-12 2021-05-20 Oppo广东移动通信有限公司 Phase difference obtaining method and apparatus, and electronic device
CN113038028A (en) * 2021-03-24 2021-06-25 浙江光珀智能科技有限公司 Image generation method and system
CN113794826A (en) * 2021-09-28 2021-12-14 浙江科技学院 Light intensity modulation interference method and system for accurately pointing laser interference
WO2023245906A1 (en) * 2022-06-24 2023-12-28 奥比中光科技集团股份有限公司 Tof sensor-based sweeping robot obstacle avoidance and navigation method and apparatus, and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102937425A (en) * 2012-10-18 2013-02-20 北京航空航天大学 Measuring system of three-dimensional shape of strong reflecting surface based on high dynamic strip projector
CN104918035A (en) * 2015-05-29 2015-09-16 深圳奥比中光科技有限公司 Method and system for obtaining three-dimensional image of target
CN104935911A (en) * 2014-03-18 2015-09-23 华为技术有限公司 Method and device for high-dynamic-range image synthesis
CN106204732A (en) * 2016-07-21 2016-12-07 深圳市易尚展示股份有限公司 The three-dimensional rebuilding method of dynamic exposure and system
CN106910242A (en) * 2017-01-23 2017-06-30 中国科学院自动化研究所 The method and system of indoor full scene three-dimensional reconstruction are carried out based on depth camera
US20170289515A1 (en) * 2016-04-01 2017-10-05 Intel Corporation High dynamic range depth generation for 3d imaging systems
CN107241559A (en) * 2017-06-16 2017-10-10 广东欧珀移动通信有限公司 Portrait photographic method, device and picture pick-up device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Wang Yinming (王因明): "Design of Optical Metrology Instruments, Part 2 (Trial Textbook for Institutions of Higher Education)", 28 February 1982 *


Also Published As

Publication number Publication date
CN109819173B (en) 2021-12-03

Similar Documents

Publication Publication Date Title
CN109819173A (en) Depth integration method and TOF camera based on TOF imaging system
US9432593B2 (en) Target object information acquisition method and electronic device
CN106454090B (en) Atomatic focusing method and system based on depth camera
CN110455258A (en) A kind of unmanned plane Terrain Clearance Measurement method based on monocular vision
US10916025B2 (en) Systems and methods for forming models of three-dimensional objects
CN109831660A (en) Depth image acquisition method, depth image obtaining module and electronic equipment
JP2023509137A (en) Systems and methods for capturing and generating panoramic 3D images
CN105765558A (en) Low power eye tracking system and method
CN108733419A (en) Lasting awakening method, device, smart machine and the storage medium of smart machine
CN107783353A (en) For catching the apparatus and system of stereopsis
CN108140066A (en) Drawing producing device and drawing production method
CN109819174A (en) Automatic exposure method and automatic exposure time calculation method based on TOF imaging system, and TOF camera
CN107111764A (en) By the event of depth triggering of the object in the visual field of imaging device
CN106254738A (en) Dual image acquisition system and image-pickup method
CN109885053A (en) A kind of obstacle detection method, device and unmanned plane
CN108924408A (en) A kind of Depth Imaging method and system
CN206399422U (en) Multifunctional vision sensor and mobile robot
CN108881717A (en) A kind of Depth Imaging method and system
CN107205111A (en) Camera device, mobile device, camera system, image capture method and recording medium
CN209991983U (en) Obstacle detection equipment and unmanned aerial vehicle
CN109129477A (en) A kind of 3 D positioning system based on binocular welding manipulator
US11503269B2 (en) Hybrid imaging system for underwater robotic applications
CN111654626B (en) High-resolution camera containing depth information
CN106382920A (en) Multifunctional visual sensor, mobile robot and control method of mobile robot
CN108347561B (en) Laser guide scanning system and scanning method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant