CN112684466A - Distance measuring method, depth camera, electronic device, and storage medium - Google Patents

Distance measuring method, depth camera, electronic device, and storage medium Download PDF

Info

Publication number
CN112684466A
Authority
CN
China
Prior art keywords
pulse signal
optical pulse
frequency intensity
target object
distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202011350400.7A
Other languages
Chinese (zh)
Inventor
鞠晓山
李宗政
冯坤亮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangxi Oumaisi Microelectronics Co Ltd
Original Assignee
Jiangxi Oumaisi Microelectronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiangxi Oumaisi Microelectronics Co Ltd filed Critical Jiangxi Oumaisi Microelectronics Co Ltd
Priority to CN202011350400.7A priority Critical patent/CN112684466A/en
Publication of CN112684466A publication Critical patent/CN112684466A/en
Withdrawn legal-status Critical Current

Links

Images

Landscapes

  • Measurement Of Optical Distance (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The application provides a distance measuring method based on time of flight, which comprises the following steps: emitting a light pulse signal to a target object, wherein the light pulse signal has a preset co-frequency intensity, and the co-frequency intensity comprises a pulse width and a pulse wavelength; receiving the light pulse signal reflected back by the target object and converting it into an electrical signal; converting the electrical signal to obtain the co-frequency intensity of the received light pulse signal; acquiring the co-frequency intensity variation of the light pulse signal according to the co-frequency intensity of the transmitted light pulse signal and the co-frequency intensity of the received light pulse signal; and acquiring the distance of the target object according to the co-frequency intensity variation. The application also provides a depth camera, an electronic device, and a storage medium.

Description

Distance measuring method, depth camera, electronic device, and storage medium
Technical Field
The application relates to the technical field of time-of-flight ranging, in particular to a time-of-flight-based distance measuring method, a depth camera, electronic equipment and a storage medium.
Background
TOF (Time of Flight) is broadly understood as a technique for learning certain properties of ions or media by measuring the time it takes an object, particle, or wave to fly a certain distance through a fixed medium (the medium, distance, and time all being known or measurable). The TOF ranging method belongs to the two-way ranging techniques and mainly uses the round-trip time of flight of a signal between two reflecting surfaces to measure the distance between nodes. Conventional ranging techniques are classified into two-way ranging techniques and one-way ranging techniques.
Conventional electronic devices mostly use phase interferometry to perform distance measurement: a light source is modulated to transmit a pulsed signal at a given frequency, the reflected signal is obtained, and a mixer derives the distance from the phase change between the transmitted and reflected signals. However, in the process of implementing the present invention, the inventors found at least the following problem in the prior art: the phase interference method places high demands on the measurement environment and is very sensitive to background noise during measurement, and the noise can easily prevent the phase from being resolved, so that accurate distance information cannot be acquired.
Disclosure of Invention
In view of the above problems, the present application provides a distance measuring method based on time of flight, a depth camera, an electronic device, and a storage medium to solve the above problems.
A first aspect of the present application provides a time-of-flight distance measurement method, the method comprising:
emitting a light pulse signal to a target object, wherein the light pulse signal has a preset co-frequency intensity, and the co-frequency intensity comprises a pulse width and a pulse wavelength;
receiving the optical pulse signal reflected back by the target object and converting the optical pulse signal into an electrical signal;
converting the electrical signal to obtain the co-frequency intensity of the received optical pulse signal;
acquiring the co-frequency intensity variation of the optical pulse signal according to the co-frequency intensity of the transmitted optical pulse signal and the co-frequency intensity of the received optical pulse signal;
and acquiring the distance of the target object according to the co-frequency intensity variation.
Therefore, by transmitting a plurality of optical pulse signals and receiving the corresponding optical pulse signals, the distance of the target object is obtained from the co-frequency intensity change between the transmitted and received optical pulse signals; this measurement approach places low demands on the measurement environment and is applicable to a wide range of scenarios.
In an embodiment of the present application, the emitting a plurality of optical pulse signals to a target object specifically includes:
providing a pulse signal with preset co-frequency intensity so that the light source emits a pulse light beam with preset co-frequency intensity;
the pulsed light beam is spatially modulated to form a plurality of light pulse signals that are emitted outward.
Thus, a non-flood carrier beam with a non-uniform intensity distribution is formed through temporal and spatial modulation and emitted outwards.
In an embodiment of the present application, the converting the electrical signal to obtain the co-frequency intensity of the received optical pulse signal specifically includes:
converting the electrical signal with an AC/DC converter in cooperation with a band-pass filter;
and acquiring the potential level of the electrical signal through a counter, and acquiring the actual co-frequency intensity of the optical pulse signal according to the potential level.
Thus, the signal is converted to read the co-frequency intensity of the received optical pulse signal.
In an embodiment of the present application, the obtaining the distance to the target object according to the co-frequency intensity variation specifically includes:
executing the following formula to obtain the distance of the target object:
ΔT = GVD × L × Δλ;
where ΔT is the pulse width difference between the transmitted optical pulse signal and the received optical pulse signal, GVD is the group velocity dispersion, L is the distance of the target object, and Δλ is the pulse wavelength difference between the transmitted optical pulse signal and the received optical pulse signal.
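To make the relation concrete, it can be rearranged as L = ΔT / (GVD × Δλ): once the pulse width difference and the pulse wavelength difference are known, the distance follows directly. The sketch below illustrates this rearrangement; the numerical values are placeholder assumptions, not parameters disclosed in this application.

```python
# Minimal sketch: solving ΔT = GVD * L * Δλ for the distance L.
# All numerical values below are illustrative assumptions, not values from this application.

def distance_from_pulse_broadening(delta_t, gvd, delta_lambda):
    """Return the target distance L from the measured pulse width difference ΔT,
    the group velocity dispersion GVD, and the pulse wavelength difference Δλ.
    Units must be mutually consistent (e.g. seconds, s/m^2, meters)."""
    return delta_t / (gvd * delta_lambda)

delta_t = 2.0e-12        # assumed measured pulse width difference ΔT, in seconds
gvd = 1.0e-7             # assumed group velocity dispersion, in s per m of path per m of Δλ
delta_lambda = 100e-9    # assumed pulse wavelength difference Δλ, in meters

print(distance_from_pulse_broadening(delta_t, gvd, delta_lambda))  # -> 200.0 (meters)
```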
A second aspect of the present application provides a depth camera comprising:
the transmitting module is used for transmitting a plurality of optical pulse signals to a target object, wherein the optical pulse signals have preset co-frequency intensity, and the co-frequency intensity comprises pulse width and pulse wavelength;
the acquisition module comprises an image sensor consisting of at least one pixel, and the image sensor receives the light pulse signals reflected back by the target object and converts the light pulse signals into electric signals;
the conversion module comprises a plurality of passive elements and is used for converting the electric signals so as to acquire the co-frequency intensity of the received optical pulse signals;
and the calculation module is used for acquiring the variation according to the co-frequency intensity of the transmitted optical pulse signal and the co-frequency intensity of the received optical pulse signal and acquiring the distance of the target object according to the variation.
In an embodiment of the present application, the transmitting module includes:
a light source for emitting a light beam of a single wavelength or a dual wavelength;
the temporal light modulator is used for providing a pulse signal with a preset co-frequency intensity so as to control the light source to emit a pulsed light beam with the preset co-frequency intensity;
and the spatial light modulator is used for spatially modulating the pulse light beam to form an outward-emitted light pulse signal.
In an embodiment of the present application, the conversion module includes:
a band-pass filter;
an AC/DC converter cooperating with the band-pass filter to convert the electrical signal;
and the counter is used for acquiring the potential level of the electrical signal and acquiring the actual co-frequency intensity of the optical pulse signal according to the potential level.
In an embodiment of the present application, the calculation module executes the following formula to obtain the distance of the target object:
ΔT = GVD × L × Δλ;
where ΔT is the pulse width difference between the transmitted optical pulse signal and the received optical pulse signal, GVD is the group velocity dispersion, L is the distance of the target object, and Δλ is the pulse wavelength difference between the transmitted optical pulse signal and the received optical pulse signal.
A third aspect of the present application provides an electronic device, comprising:
the device comprises a memory, a processor, and a communication bus, wherein the memory is in communication connection with the processor through the communication bus; and a plurality of program modules are stored in the memory and are loaded by the processor to execute the distance measuring method as described above.
A fourth aspect of the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements a distance measurement method as described above.
According to the distance measuring method, the depth camera, the electronic device, and the storage medium, the distance of the target object is obtained, by transmitting a plurality of optical pulse signals and receiving the corresponding optical pulse signals, from the co-frequency intensity change between the transmitted and received optical pulse signals; this measurement approach places low demands on the measurement environment and is suitable for scenarios such as intelligent driving.
Drawings
Fig. 1 is a schematic diagram of a depth camera provided in a first embodiment of the present application.
Fig. 2 is a schematic diagram of a transmitting module according to an embodiment of the present application.
Fig. 3 is a schematic diagram of a transmitting module according to another embodiment of the present application.
Fig. 4 is a schematic diagram of a conversion module according to an embodiment of the present application.
Fig. 5 is a timing waveform diagram of an emitted optical pulse signal and a received optical pulse signal in an embodiment of the application.
FIG. 6 is a flow chart of a method for distance measurement according to an embodiment of the present application.
Fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order that the objects, features, and advantages of the present application can be more clearly understood, the present application will be described in detail below with reference to the accompanying drawings and specific embodiments. It should be noted that the embodiments and the features of the embodiments of the present application may be combined with each other without conflict. In the following description, numerous specific details are set forth to provide a thorough understanding of the present application; the described embodiments are merely a subset of the embodiments of the present application rather than all of them.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein in the description of the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application.
Referring to fig. 1, fig. 1 is a schematic diagram of a depth camera 100 according to an embodiment of the present disclosure. The depth camera 100 is a TOF depth camera that can be used for distance measurements to accurately sense the surrounding environment and changes.
In an embodiment of the present application, the depth camera 100 includes a transmitting module 10, an acquisition module 20, a conversion module 30, and a calculation module 40.
The transmitting module 10 is used for transmitting a dual-wavelength optical pulse signal to the target object 200. The dual-wavelength optical pulse signal generally includes a short-wavelength laser beam and a long-wavelength laser beam, and such a momentum-correlated dual-wavelength beam has strong anti-interference capability. After the long-wavelength laser beam flies over a long distance, the intensity of the corresponding first co-frequency remains essentially unchanged; after the short-wavelength laser beam flies over a longer distance, the intensity of the corresponding second co-frequency decreases. It is understood that the transmitting module 10 may also transmit a single-wavelength optical pulse signal to the target object 200, in which case the intensity of the corresponding second co-frequency likewise decreases after a longer flight distance.
The optical pulse signal has a preset co-frequency intensity, wherein the co-frequency intensity comprises a pulse width and a pulse wavelength.
The acquisition module 20 includes an image sensor 21 having a pixel array, and the image sensor 21 is configured to receive the light pulse signal reflected by the target object 200 and convert the light pulse signal into an electrical signal.
The conversion module 30 includes a plurality of passive elements, and the conversion module 30 is configured to convert the electrical signal to obtain the co-frequency intensity of the received optical pulse signal.
The calculating module 40 is configured to obtain a variation according to the co-frequency intensity of the transmitted optical pulse signal and the co-frequency intensity of the received optical pulse signal, and obtain the distance to the target object according to the variation.
Thus, by transmitting an optical pulse signal and receiving the corresponding reflected optical pulse signal, the depth camera 100 can obtain the distance of the target object from the co-frequency intensity change between the transmitted and received optical pulse signals; this measurement approach places low demands on the measurement environment and is applicable to a wide range of scenarios.
In an embodiment of the present application, referring to fig. 2, the emitting module 10 includes a light source 12, a temporal light modulator 14, and a spatial light modulator 16. The light source 12 is configured to emit a single-wavelength or dual-wavelength light beam; the temporal light modulator 14 is configured to provide a pulse signal with a preset co-frequency intensity so as to control the light source to emit a pulsed light beam with the preset co-frequency intensity; and the spatial light modulator 16 is configured to spatially modulate the pulsed light beam to form a plurality of light pulse signals emitted outwards. In this manner, the distribution of the pulsed light beam in space is modulated by the temporal light modulator 14 and the spatial light modulator 16 to form a non-flood carrier beam with a non-uniform intensity distribution for emission.
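As a rough illustration of the temporal modulation step alone (the spatial modulation by the spatial light modulator 16 is omitted), the sketch below generates a rectangular drive waveform with a preset repetition period and pulse width; the 20 ms period echoes the example given later for fig. 5, while the pulse width and sampling rate are assumptions for illustration.

```python
# Toy sketch of the temporal modulation only: a rectangular drive waveform with a
# preset repetition period and pulse width for the light source. The pulse width
# and sampling rate are illustrative assumptions.
import numpy as np

def pulse_train(duration_s, fs_hz, period_s, pulse_width_s):
    """Return (waveform, time axis) for a rectangular pulse train."""
    t = np.arange(0.0, duration_s, 1.0 / fs_hz)
    waveform = ((t % period_s) < pulse_width_s).astype(float)
    return waveform, t

# 100 ms of drive signal, 20 ms repetition period, 1 ms pulses, sampled at 1 MHz.
drive, t = pulse_train(duration_s=0.1, fs_hz=1.0e6, period_s=20e-3, pulse_width_s=1e-3)
```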
Compared with a traditional flood beam, the non-flood beam has a non-uniform intensity distribution, so that, at the same light-source power, regions of higher intensity have stronger resistance to ambient-light interference; in addition, for the same projection field of view, the non-uniform intensity distribution means that the non-flood beam requires less power to achieve the same resistance to ambient light.
The light source 12 may be a Light Emitting Diode (LED), an Edge Emitting Laser (EEL), a Vertical Cavity Surface Emitting Laser (VCSEL), or the like, or a light source array composed of a plurality of such light sources, and the light beam emitted by the light source may be visible light, infrared light, ultraviolet light, etc. The temporal light modulator 14 may be a separate control circuit or a processing circuit, for example, a processing circuit that modulates the light source to emit light pulse signals of different co-frequency intensities. The frequency of the optical pulse signal is set according to the measurement distance; it may be set to 1 MHz to 100 MHz, for example, for measurement distances from several meters to several hundred meters.
Further, the processing circuit may be a stand-alone dedicated circuit, such as a dedicated SOC chip, an FPGA chip, an ASIC chip, etc., or may include a general-purpose processor, for example, when the depth camera is integrated into a smart terminal, such as a mobile phone, a television, a computer, etc., the processor in the terminal may be at least a part of the processing circuit.
Fig. 3 is a schematic diagram of the transmitting module 10 according to an embodiment of the present application. The transmission module 10 includes a light source 12, a driving circuit 13, a lens 15, and a diffractive optical element 17.
The light source 12 emits a pulse-modulated light beam under the temporal power modulation of the driving circuit 13; the light beam is collimated or focused by the lens 15 and then enters the diffractive optical element 17, which spatially modulates, i.e., diffracts, the incident light beam.
In one embodiment of the present application, the diffractive optical element 17 splits the incident light beam into a plurality of light beams emitted toward the target object 200, each of which forms a spot on the surface of the target object.
In an embodiment of the present application, the diffractive optical element 17 forms a regularly arranged array of spots by diffracting the incident light beam.
In an embodiment of the present application, the diffractive optical element 17 forms a speckle pattern by diffracting the incident light beam, i.e., the arrangement of the spots has a certain randomness.
The light source 12 may be a single light source or an array of light sources. In one embodiment, the light source 12 is a light source array composed of a plurality of regularly arranged light sources, such as a VCSEL array chip composed of a semiconductor substrate and a plurality of VCSEL light sources arranged on the substrate.
The diffractive optical element 17 replicates the array beam emitted by the light source 12, and the outwardly emitted non-flood beam is composed of a plurality of replicated array beams, whereby the number of beams can be increased.
In some embodiments, the spatial light modulator in the transmitting module 10 may instead include a mask plate that contains a two-dimensional pattern for modulating the incident light beam into the non-flood beam; for example, the incident light beam may be spatially modulated by the mask plate to form a two-dimensionally encoded pattern beam.
In some embodiments, the spatial light modulator in the emitting module 10 may instead include a microlens array formed by arranging a plurality of microlens units. In one embodiment, the microlens units receive the light beam from the light source 12 and generate an array light beam, corresponding to the arrangement of the microlens units, that is emitted outwards.
In some embodiments, the light source 12 includes a plurality of sub-light sources arranged to correspond to the microlens array, and each microlens unit receives the light beam of its corresponding sub-light source and, after collimating or focusing it, emits the array light beam outwards. The array light beams can be arranged randomly or regularly.
It can be understood that the co-frequency intensity of the transmitted optical pulse signal can be preset, as long as the co-frequency intensity is within the acquisition frequency range of the acquisition module 20, and the co-frequency intensity of the received optical pulse signal can be acquired by the passive elements; this measurement scheme therefore places low requirements on the equipment and is easy to implement.
It can be understood that the light source 12 may emit a single-wavelength light beam or a dual-wavelength light beam, and according to the chromatic dispersion of light, after an optical pulse signal corresponding to the dual-wavelength light beam is transmitted in the air by a preset distance, the pulse width of the optical pulse signal changes, so that the resolution in distance measurement can be improved. According to the scattering of light, after the optical pulse signal corresponding to the single-wavelength light beam is transmitted for a preset distance in the air, the pulse width of the optical pulse signal changes.
Acquisition module 20 includes an image sensor 22. The image sensor 22 may be a Charge Coupled Device (CCD), a Complementary Metal Oxide Semiconductor (CMOS), an Avalanche Diode (AD), a Single Photon Avalanche Diode (SPAD), or the like.
In one embodiment of the present application, the image sensor 22 includes an array of pixels, each pixel including a plurality of taps, such as 5 taps, for reading the electrical signal; each tap stores and reads out, or discharges, the charge signal generated by incident photons under the control of a corresponding electrode.
Referring to fig. 4, which is a schematic diagram of the conversion module 30 according to the present application, the conversion module 30 is connected to the image sensor 22 and sequentially includes a band-pass filter 32, an AC/DC converter 34, and a counter 36. The band-pass filter 32 is configured to filter out components of the electrical signal outside a preset frequency range; the AC/DC converter 34 cooperates with the band-pass filter 32 to convert the electrical signal into a direct-current signal for subsequent reading; and the counter 36 is configured to acquire the potential level of the electrical signal and obtain the actual co-frequency intensity of the optical pulse signal from that potential level. The band-pass filter 32, the AC/DC converter 34, and the counter 36 are all passive components, and the electrical signal converted by the image sensor 22 is input to them in sequence to obtain the actual co-frequency intensity of the optical pulse signal.
In one embodiment, counter 36 is a capacitor.
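As a rough software analogue of this filter/rectify/read chain (not the actual passive circuit), the sketch below band-pass filters a sampled electrical signal around one co-frequency, rectifies it, and averages it to a level that stands in for the value accumulated by the counter/capacitor; the frequencies and filter parameters are assumptions for illustration.

```python
# Rough software analogue of the passive conversion chain (band-pass filter,
# AC/DC conversion, level read-out). All names, frequencies, and filter
# parameters are illustrative assumptions, not the application's circuit.
import numpy as np
from scipy.signal import butter, filtfilt

def co_frequency_intensity(electrical_signal, fs_hz, center_hz, half_band_hz=10.0):
    """Estimate the intensity of one co-frequency component of the electrical signal."""
    # Band-pass filter: keep only the component near the co-frequency of interest.
    b, a = butter(4, [center_hz - half_band_hz, center_hz + half_band_hz],
                  btype="bandpass", fs=fs_hz)
    narrow = filtfilt(b, a, electrical_signal)
    # AC/DC conversion: rectify the AC component and average it to a DC level,
    # analogous to what the counter/capacitor would accumulate.
    return float(np.mean(np.abs(narrow)))

# Example: a synthetic sensor signal with 100 Hz and 200 Hz components plus noise.
fs = 10_000.0
t = np.arange(0.0, 1.0, 1.0 / fs)
signal = 1.0 * np.sin(2 * np.pi * 100 * t) + 0.6 * np.sin(2 * np.pi * 200 * t)
signal += 0.05 * np.random.randn(t.size)
print(co_frequency_intensity(signal, fs, 200.0))  # level of the 200 Hz component
```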
In one embodiment, the conversion module 30 is a Fourier transform chip for performing a Fourier transform on the electrical signal.
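The Fourier-transform variant can be sketched the same way: the co-frequency intensity is read off as the spectral magnitude at the frequency of interest. This is a hedged illustration of the idea, not the chip's actual implementation.

```python
# Hedged sketch of the Fourier-transform alternative: read the magnitude of the
# electrical signal's spectrum at the co-frequency of interest.
import numpy as np

def co_frequency_intensity_fft(electrical_signal, fs_hz, center_hz):
    spectrum = np.fft.rfft(electrical_signal)
    freqs = np.fft.rfftfreq(len(electrical_signal), d=1.0 / fs_hz)
    k = int(np.argmin(np.abs(freqs - center_hz)))
    # Scale so the result approximates the amplitude of that sinusoidal component.
    return 2.0 * np.abs(spectrum[k]) / len(electrical_signal)
```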
In an embodiment of the present application, the calculation module 40 executes the following formula to obtain the distance of the target object:
ΔT = GVD × L × Δλ;
where ΔT is the pulse width difference between the transmitted optical pulse signal and the received optical pulse signal, GVD is the group velocity dispersion, L is the distance of the target object, and Δλ is the pulse wavelength difference between the transmitted optical pulse signal and the received optical pulse signal. The pulse width difference ΔT and the pulse wavelength difference Δλ can both be obtained from the variation of the optical pulse signal.
In an embodiment of the present application, fig. 5 is a timing waveform diagram of an emitted optical pulse signal and a received optical pulse signal. The waveform diagram may be obtained with an oscilloscope or other equipment. It is understood that the pulse train period of the transmitted optical pulse signal is 20 ms, the first co-frequency is 100 Hz, and the second co-frequency is 200 Hz; the optical pulse signal reflected by the target object is received by the image sensor 22, the intensity of the second co-frequency of the received optical pulse signal decreases, and the second co-frequency difference Δf is obtained from the second co-frequency of the transmitted optical pulse signal and that of the received optical pulse signal. The pulse width difference ΔT can be obtained from the second co-frequency difference Δf. The wavelength difference is determined by the specification of the light beam emitted by the light source 12: when the beam specification is preset, the wavelength difference is fixed and Δλ can be obtained directly; the group velocity dispersion GVD is likewise determined by the parameters of the light source. The distance of the target object can therefore be obtained directly from these known parameters.
In an embodiment of the present application, the relationship between the second co-frequency difference Δf and the pulse width difference ΔT satisfies:
Δf = α × ΔT, where α is a correction parameter whose value can be determined from the parameters of the light source itself.
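Chaining the two relations, a measured second co-frequency difference Δf maps to a pulse width difference ΔT through the correction parameter α, and ΔT then yields the distance through the GVD relation given earlier. The sketch below is illustrative only; the values of α, GVD, and Δλ are assumptions that would in practice come from calibrating the light source.

```python
# Hedged sketch: from the measured second co-frequency difference Δf to the
# distance L, via Δf = α·ΔT and ΔT = GVD·L·Δλ. All numbers are placeholder
# assumptions, to be replaced by calibrated light-source parameters.

def distance_from_co_frequency_shift(delta_f_hz, alpha, gvd, delta_lambda_m):
    delta_t_s = delta_f_hz / alpha               # invert Δf = α·ΔT
    return delta_t_s / (gvd * delta_lambda_m)    # invert ΔT = GVD·L·Δλ

alpha = 5.0e12          # assumed Hz of co-frequency shift per second of pulse broadening
gvd = 1.0e-7            # assumed group velocity dispersion, consistent SI units
delta_lambda = 100e-9   # assumed wavelength difference of the dual-wavelength beam, meters

print(distance_from_co_frequency_shift(10.0, alpha, gvd, delta_lambda))  # -> 200.0 meters
```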
Referring to fig. 6, fig. 6 is a schematic flowchart illustrating a method for measuring a distance based on time of flight according to an embodiment of the present application. The order of the steps in the flow chart may be changed and some steps may be omitted according to different needs. For convenience of explanation, only portions related to the embodiments of the present application are shown.
The distance measuring method is applied to a depth camera. For a depth camera that needs to perform distance measurement, the distance measurement function provided by the method of the present application can be integrated directly into the depth camera, or a client implementing the distance measurement method of the present application can be installed on it. Alternatively, the distance measurement method provided by the present application may run on the depth camera in the form of a Software Development Kit (SDK); an interface for the distance measurement function is provided in SDK form, and a processor or other device can implement the distance measurement function through the provided interface. The distance measuring method includes the following steps.
Step S1, emitting a light pulse signal to the target object.
The optical pulse signal has a preset co-frequency intensity, wherein the co-frequency intensity comprises a pulse width and a pulse wavelength.
And step S2, receiving the optical pulse signal reflected by the target object, and converting the optical pulse signal into an electric signal.
The received optical pulse signal is converted into an electrical signal so that it can be read out electrically.
And step S3, converting the electric signal to obtain the co-frequency intensity of the received optical pulse signal.
Specifically, the band-pass filter cooperates with the AC/DC converter to acquire the co-frequency intensity of the received optical pulse signal. It is understood that the electrical signal may also be converted by a Fourier transform chip to obtain the co-frequency intensity of the received optical pulse signal.
And step S4, acquiring the variation of the co-frequency intensity of the optical pulse signal according to the co-frequency intensity of the transmitted optical pulse signal and the co-frequency intensity of the received optical pulse signal.
And step S5, acquiring the distance of the target object according to the co-frequency intensity variation.
Therefore, by transmitting a plurality of optical pulse signals and receiving the corresponding optical pulse signals, the distance of the target object is obtained from the co-frequency intensity change between the transmitted and received optical pulse signals; this measurement approach places low demands on the measurement environment, applies to a wide range of scenarios, has a simple structure, and is easy to implement.
In an embodiment, step S1 specifically includes:
providing a pulse signal with preset co-frequency intensity so that the light source emits a pulse light beam with preset co-frequency intensity;
the pulsed light beam is spatially modulated to form a plurality of light pulse signals that are emitted outward.
Specifically, the light beam emitted by the light source is temporally and spatially modulated to form a non-flood carrier beam having a non-uniform intensity distribution for emission.
In an embodiment of the present application, step S3 specifically includes:
converting the electrical signal with an AC/DC converter in cooperation with a band-pass filter;
and acquiring the potential level of the electrical signal through a counter, and acquiring the actual co-frequency intensity of the optical pulse signal according to the potential level.
Thus, the signal is converted in order to read the co-frequency intensity of the received optical pulse signal.
In an embodiment, step S5 specifically includes:
executing the following formula to obtain the distance of the target object:
ΔT = GVD × L × Δλ;
where ΔT is the pulse width difference between the transmitted optical pulse signal and the received optical pulse signal, GVD is the group velocity dispersion, L is the distance of the target object, and Δλ is the pulse wavelength difference between the transmitted optical pulse signal and the received optical pulse signal.
Fig. 6 is a schematic diagram illustrating a distance measurement method according to the present application in detail, and fig. 7 is a schematic diagram illustrating an architecture of an electronic device according to an embodiment of the present application. The functional modules and hardware device architecture for implementing the distance measurement are described below with reference to fig. 6 and 7.
The electronic device 300 comprises a memory 311, a processor 312 and a communication bus 313, wherein the memory 311 is connected with the processor 312 in a communication mode through the communication bus 313.
The electronic device 300 further comprises a computer program 314, such as a program for distance measurement, stored in the memory 311 and executable on the processor 312.
The steps of the distance measurement method in the method embodiment are implemented when the processor 312 executes the computer program 314. Alternatively, the processor 312 executes the computer program 314 to realize the functions of the modules/units in the system embodiments.
The computer program 314 may be partitioned into one or more modules/units that are stored in the memory 311 and executed by the processor 312 to accomplish the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, the instruction segments describing the execution process of the computer program 314 in the electronic device 300.
It will be understood by those skilled in the art that fig. 7 is merely an example of the electronic device 300 and is not intended to limit the electronic device 300; the electronic device 300 may include more or fewer components than those shown, or combine some components, or have different components; for example, the electronic device 300 may further include an input device, etc.
The processor 312 may be a Central Processing Unit (CPU), or another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor; the processor 312 is the control center of the electronic device 300 and connects the various parts of the entire electronic device 300 through various interfaces and lines.
The memory 311 may be used for storing the computer program 314 and/or the modules/units, and the processor 312 implements the various functions of the electronic device 300 by running or executing the computer program and/or modules/units stored in the memory 311 and calling data stored in the memory 311. The memory 311 may include an external storage medium and may also include internal memory. In addition, the memory 311 may include a high-speed random access memory, and may also include a non-volatile memory, such as a hard disk, a memory, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, a Flash memory card (Flash Card), at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
If the modules/units integrated in the electronic device 300 are implemented in the form of software functional units and sold or used as separate products, they may be stored in a computer-readable storage medium. Based on such understanding, all or part of the processes in the methods of the above embodiments may be implemented by a computer program, which may be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the above method embodiments. It should be noted that the content of the computer-readable medium may be appropriately increased or decreased as required by legislation and patent practice in the relevant jurisdictions; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunication signals in accordance with legislation and patent practice.
Finally, it should be noted that the above embodiments are only used for illustrating the technical solutions of the present application and not for limiting, and although the present application is described in detail with reference to the preferred embodiments, it should be understood by those skilled in the art that modifications or equivalent substitutions can be made on the technical solutions of the present application without departing from the spirit and scope of the technical solutions of the present application.

Claims (10)

1. A time-of-flight based distance measurement method, the method comprising:
emitting a light pulse signal to a target object, wherein the light pulse signal has a preset co-frequency intensity, and the co-frequency intensity comprises a pulse width and a pulse wavelength;
receiving the optical pulse signal reflected back by the target object and converting the optical pulse signal into an electrical signal;
converting the electrical signal to obtain the co-frequency intensity of the received optical pulse signal;
acquiring the co-frequency intensity variation of the optical pulse signal according to the co-frequency intensity of the transmitted optical pulse signal and the co-frequency intensity of the received optical pulse signal;
and acquiring the distance of the target object according to the co-frequency intensity variation.
2. The distance measuring method according to claim 1, wherein said transmitting a plurality of light pulse signals to the target object specifically comprises:
providing a pulse signal with preset co-frequency intensity so that the light source emits a pulse light beam with preset co-frequency intensity;
the pulsed light beam is spatially modulated to form a plurality of light pulse signals that are emitted outward.
3. The distance measurement method according to claim 1, wherein said converting said electrical signal to obtain the co-frequency intensity of the received optical pulse signal comprises:
converting the electrical signal with an AC/DC converter in cooperation with a band-pass filter;
and acquiring the potential level of the electrical signal through a counter, and acquiring the actual co-frequency intensity of the optical pulse signal according to the potential level.
4. The method according to claim 3, wherein the obtaining the distance of the target object according to the co-frequency intensity variation includes:
executing the following formula to obtain the distance of the target object:
ΔT = GVD × L × Δλ;
where ΔT is the pulse width difference between the transmitted optical pulse signal and the received optical pulse signal, GVD is the group velocity dispersion, L is the distance of the target object, and Δλ is the pulse wavelength difference between the transmitted optical pulse signal and the received optical pulse signal.
5. A depth camera, characterized in that the depth camera comprises:
the transmitting module is used for transmitting an optical pulse signal to a target object, wherein the optical pulse signal has preset co-frequency intensity, and the co-frequency intensity comprises a pulse width and a pulse wavelength;
an acquisition module comprising an image sensor having an array of pixels, the image sensor receiving the light pulse signals reflected back by the target object and converting the light pulse signals into electrical signals;
the conversion module comprises a plurality of passive elements and is used for converting the electric signals so as to acquire the co-frequency intensity of the received optical pulse signals;
and the calculation module is used for acquiring the variation according to the co-frequency intensity of the transmitted optical pulse signal and the co-frequency intensity of the received optical pulse signal and acquiring the distance of the target object according to the variation.
6. The depth camera of claim 5, wherein the transmit module comprises:
a light source for emitting a light beam of a single wavelength or a dual wavelength;
the temporal light modulator is used for providing a pulse signal with a preset co-frequency intensity so as to control the light source to emit a pulsed light beam with the preset co-frequency intensity;
and the spatial light modulator is used for spatially modulating the pulse light beam to form an outward-emitted light pulse signal.
7. The depth camera of claim 5, wherein the conversion module comprises:
a band-pass filter;
an AC/DC converter cooperating with the band-pass filter to convert the electrical signal;
and the counter is used for acquiring the potential level of the electrical signal and acquiring the actual co-frequency intensity of the optical pulse signal according to the potential level.
8. The depth camera of claim 5, wherein the calculation module executes the following formula to obtain the distance of the target object:
ΔT = GVD × L × Δλ;
where ΔT is the pulse width difference between the transmitted optical pulse signal and the received optical pulse signal, GVD is the group velocity dispersion, L is the distance of the target object, and Δλ is the pulse wavelength difference between the transmitted optical pulse signal and the received optical pulse signal.
9. An electronic device, characterized in that the electronic device comprises:
the device comprises a memory, a processor and a communication bus, wherein the memory is in communication connection with the processor through the communication bus; and
the memory has stored therein a plurality of program modules that are loaded by the processor and execute the distance measuring method according to any one of claims 1 to 4.
10. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the distance measuring method according to any one of claims 1 to 4.
CN202011350400.7A 2020-11-26 2020-11-26 Distance measuring method, depth camera, electronic device, and storage medium Withdrawn CN112684466A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011350400.7A CN112684466A (en) 2020-11-26 2020-11-26 Distance measuring method, depth camera, electronic device, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011350400.7A CN112684466A (en) 2020-11-26 2020-11-26 Distance measuring method, depth camera, electronic device, and storage medium

Publications (1)

Publication Number Publication Date
CN112684466A true CN112684466A (en) 2021-04-20

Family

ID=75446809

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011350400.7A Withdrawn CN112684466A (en) 2020-11-26 2020-11-26 Distance measuring method, depth camera, electronic device, and storage medium

Country Status (1)

Country Link
CN (1) CN112684466A (en)

Similar Documents

Publication Publication Date Title
WO2021008209A1 (en) Depth measurement apparatus and distance measurement method
CN111142088B (en) Light emitting unit, depth measuring device and method
CN109343070A (en) Time flight depth camera
Bronzi et al. Automotive three-dimensional vision through a single-photon counting SPAD camera
CN110221272B (en) Time flight depth camera and anti-interference distance measurement method
CN111123289B (en) Depth measuring device and measuring method
US9894347B2 (en) 3D image acquisition apparatus and method of driving the same
US20220082698A1 (en) Depth camera and multi-frequency modulation and demodulation-based noise-reduction distance measurement method
CN209167538U (en) Time flight depth camera
US8254665B2 (en) Systems for capturing three-dimensional one or more images and methods thereof
CN110361751A (en) The distance measurement method of time flight depth camera and the reduction noise of single-frequency modulation /demodulation
CN212694038U (en) TOF depth measuring device and electronic equipment
CN110221274A (en) Time flight depth camera and the distance measurement method of multifrequency modulation /demodulation
CN111708039A (en) Depth measuring device and method and electronic equipment
CN110221273A (en) Time flight depth camera and the distance measurement method of single-frequency modulation /demodulation
CN209894976U (en) Time flight depth camera and electronic equipment
US20220043129A1 (en) Time flight depth camera and multi-frequency modulation and demodulation distance measuring method
KR20140145481A (en) Tof camera for vehicle
CN110488251A (en) The preparation method of laser radar system and its laser radar echo signal curve, device
CN110596720A (en) Distance measuring system
CN112684466A (en) Distance measuring method, depth camera, electronic device, and storage medium
CN114935743B (en) Emission module, photoelectric detection device and electronic equipment
CN114935742B (en) Emission module, photoelectric detection device and electronic equipment
CN114236504A (en) dToF-based detection system and light source adjusting method thereof
CN210090674U (en) Distance measuring system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20210420