WO2023159974A1 - Ranging method, photoelectric detection module, chip, electronic device and medium


Info

Publication number
WO2023159974A1
WO2023159974A1 PCT/CN2022/125660 CN2022125660W WO2023159974A1 WO 2023159974 A1 WO2023159974 A1 WO 2023159974A1 CN 2022125660 W CN2022125660 W CN 2022125660W WO 2023159974 A1 WO2023159974 A1 WO 2023159974A1
Authority
WO
WIPO (PCT)
Prior art keywords
distance
electronic device
photoelectric
measured object
ranging
Prior art date
Application number
PCT/CN2022/125660
Other languages
English (en)
French (fr)
Inventor
谢承志
余力强
崔振威
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司
Publication of WO2023159974A1


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/02: Systems using the reflection of electromagnetic waves other than radio waves
    • G01S 17/06: Systems determining position data of a target
    • G01S 17/08: Systems determining position data of a target, for measuring distance only
    • G01S 17/88: Lidar systems specially adapted for specific applications
    • G01S 17/89: Lidar systems specially adapted for specific applications, for mapping or imaging
    • G01S 17/894: 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • G01S 7/00: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/48: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00, of systems according to group G01S17/00
    • G01S 7/481: Constructional features, e.g. arrangements of optical elements

Definitions

  • the present application relates to the technical field of distance measurement, and in particular to a distance measurement method, a photoelectric detection module, a chip, electronic equipment and a medium.
  • In the field of ranging technology, triangulation ranging or time-of-flight (ToF) ranging is generally used to measure the distance between the measured object and the device.
  • However, the triangulation ranging method is limited by the resolution of the photoelectric detection element in the device and cannot guarantee ranging accuracy in long-distance ranging scenes (for example, beyond 100 meters), while the time-of-flight ranging method is prone to waveform signal distortion in short-distance ranging (for example, within 1 meter), resulting in an inaccurate measured flight time, which in turn affects the accuracy of short-distance measurement.
  • the present application provides a ranging method, a photoelectric detection module, a chip, an electronic device and a medium.
  • In a first aspect, an embodiment of the present application provides a ranging method applicable to an electronic device. The electronic device includes a first transmitter and a first receiver, the first receiver includes a photoelectric time-of-flight (ToF) detection element, and the photoelectric ToF detection element supports a first ranging method and a second ranging method. The method includes: using the first ranging method and/or the second ranging method to determine whether the distance from the measured object to the electronic device exceeds a first preset distance; if the distance from the measured object to the electronic device is less than the first preset distance, using the distance from the measured object to the electronic device determined by the first ranging method as the ranging result; or, if the distance from the measured object to the electronic device is greater than or equal to a second preset distance, using the distance from the measured object to the electronic device determined by the second ranging method as the ranging result, wherein the second preset distance is greater than or equal to the first preset distance.
  • the above-mentioned electronic device may be an intelligent sweeping robot, a logistics service robot, etc., which is not limited in this application.
  • the photoelectric ToF detection element can be elongated, and its aspect ratio is greater than or equal to 3.
  • its aspect ratio may be 3:1.
  • the first transmitter can be a lidar.
  • the first ranging method may be triangulation ranging
  • The second ranging method may be time-of-flight ranging. It can be understood that the triangulation ranging method is more suitable for short-distance ranging, while the time-of-flight ranging method is more suitable for long-distance ranging. Therefore, the above method can decide whether to use the distance measured by the triangulation ranging method or by the time-of-flight ranging method as the ranging result, so as to ensure the accuracy of the ranging result.
  • the distance measured by the triangulation distance measurement method and/or the time-of-flight distance measurement method may be used as the judgment basis, which is not limited in the present application.
  • When the distance from the measured object to the electronic device is less than the first preset distance, the current ranging scene is a short-distance ranging scene, so the distance measured by the triangulation ranging method is used as the ranging result; when the distance from the measured object to the electronic device is greater than or equal to the second preset distance, the current ranging scene is not a short-distance ranging scene, and the distance measured by the time-of-flight ranging method is used as the ranging result.
  • the value of the first preset distance is an empirical value or an experimental value, for example, 1 meter
  • the second preset distance is greater than or equal to the first preset distance.
  • the above method is to ensure the accuracy of the ranging result of the electronic device in the short-distance ranging scene or the long-distance ranging scene.
  • When the distance between the measured object and the electronic device falls between the short-distance and long-distance ranging scenes, for example, greater than 1 meter and less than 50 meters, the accuracy of the triangulation ranging method and the time-of-flight ranging method is similar, so either the distance measured by the triangulation ranging method or the distance measured by the time-of-flight ranging method can be used as the ranging result; this application does not limit this.
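  • As a minimal sketch of this selection logic (in Python, with illustrative threshold values of 1 meter and 50 meters taken from the examples above; the function and variable names are hypothetical, not from the patent):

```python
def select_ranging_result(d_triangulation: float, d_tof: float,
                          first_preset: float = 1.0,
                          second_preset: float = 50.0) -> float:
    """Choose which measured distance to report, per the scheme described above.

    d_triangulation: distance (m) from the first ranging method (triangulation).
    d_tof:           distance (m) from the second ranging method (time of flight).
    first_preset:    below this, triangulation is treated as more accurate.
    second_preset:   at or above this, time of flight is treated as more accurate
                     (second_preset >= first_preset).
    """
    # Either measurement may serve as the judgment basis; the ToF value is used here.
    judged_distance = d_tof

    if judged_distance < first_preset:
        # Short-distance scene: report the triangulation result.
        return d_triangulation
    if judged_distance >= second_preset:
        # Long-distance scene: report the time-of-flight result.
        return d_tof
    # In between, the two methods differ little in accuracy; either may be reported.
    return d_tof


# Example from the description: 0.789 m (triangulation) vs. 0.8 m (ToF) -> 0.789 m is reported.
print(select_ranging_result(0.789, 0.8))
```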
  • The first ranging method includes: determining the distance from the measured object to the electronic device based on the imaging position, on the photoelectric ToF detection element, of the first light emitted by the first transmitter after it is reflected by the measured object. That is to say, the ranging principle of the triangulation ranging method is to use the relationship between the imaging position, on the photoelectric ToF detection element, of the light reflected by the measured object and the actual moving distance of the measured object to calculate the distance from the measured object to the electronic device.
  • The first ranging method further includes: if the distance between the measured object and the electronic device is less than or equal to a third preset distance, performing splitting processing on the first light emitted by the first transmitter so that the first light is divided into a first light path and a second light path, and determining the distance from the measured object to the electronic device based on the imaging position, on the photoelectric ToF detection element, of the light of the second light path after it is reflected by the measured object, wherein the light of the first light path has the same direction as the first light, the angle between the directions of the first light path and the second light path is in the range of 5° to 10°, and the third preset distance is smaller than the first preset distance.
  • That is, a beam-splitting device such as a beam splitter can be arranged on the first transmitter to split the light emitted by the first transmitter into a first light path and a second light path, where the first light path keeps the original outgoing direction and the angle between the second light path and the first light path is 5° to 10°, so that when the measured object is closer to the electronic device than the third preset distance, the photoelectric ToF detection element can still detect the imaging position of the measured object. Then, when the distance from the measured object to the electronic device is less than the third preset distance, such as 0.3 meters, the imaging position on the photoelectric ToF detection element of the second-light-path light reflected by the measured object can be used to calculate the distance between the measured object and the electronic device.
  • In a possible implementation of the first aspect, the electronic device further includes a second transmitter, and the first ranging method further includes: if the distance between the measured object and the electronic device is less than or equal to the third preset distance, determining the distance from the measured object to the electronic device based on the imaging position, on the photoelectric ToF detection element, of the light emitted by the second transmitter after it is reflected by the measured object, wherein the third preset distance is smaller than the first preset distance and the angle between the light emitted by the first transmitter and the light emitted by the second transmitter ranges from 5° to 10°.
  • That is, the imaging position, on the photoelectric ToF detection element, of the light that is emitted by another transmitter of the electronic device and reflected by the measured object can also be used to calculate the distance from the measured object to the electronic device.
  • the specific calculation method is the same as the principle of the triangular ranging method, and will not be repeated here.
  • The second ranging method includes: determining the distance from the measured object to the electronic device based on the time elapsed while the first light emitted by the first transmitter is reflected by the measured object and then received by the first receiver. That is to say, the ranging principle of the time-of-flight ranging method is to calculate the distance from the measured object to the electronic device using the speed of light and the time elapsed from when the light is emitted by the transmitter until it is reflected by the measured object and received by the receiver. Reference may be made to the description of the time-of-flight ranging method below; details are not repeated here.
  • The time elapsed while the light emitted by the second laser is reflected by the measured object and received by the receiver, together with the speed of light, can also be used to calculate the distance from the measured object to the electronic device, which is not limited in this application.
  • the embodiment of the present application provides a photoelectric detection module, the module includes a first laser radar, which is used to emit a first light to the object under test;
  • the receiver includes a photoelectric ToF detection element, and the photoelectric ToF detection element is used to receive the first light reflected by the measured object;
  • memory for storing instructions to be executed by the one or more processors of the photodetection module
  • the processor is one of the processors of the photoelectric detection module, and is used to implement the ranging method in any possible implementation method in the first aspect.
  • A beam-splitting device is provided on the first laser radar to perform splitting processing on the light emitted by the first laser radar, so that the first light emitted by the first laser radar is divided into a first light path and a second light path, wherein the light of the first light path is in the same direction as the first light and the angle between the directions of the first light path and the second light path is in the range of 5° to 10°.
  • The photoelectric detection module further includes a second laser radar, and the angle between the light emitted by the first laser radar and the light emitted by the second laser radar ranges from 5° to 10°.
  • the embodiment of the present application also provides an electronic device, the electronic device includes any one of the optoelectronic modules in the second aspect above,
  • memory for storing instructions to be executed by one or more processors of the electronic device
  • the processor is one of the processors of the electronic device, and is used for the ranging method in any possible implementation manner of the first aspect above.
  • the embodiment of the present application also provides a photoelectric ToF detection chip, and the photoelectric ToF detection chip includes:
  • a processor configured to execute a computer-executable program, so that the device installed with the photoelectric ToF detection chip can implement the ranging method in any of the possible implementation methods in the first aspect above.
  • the photoelectric ToF detection chip outputs the distance from the measured object to the electronic device through a communication interface.
  • The present application also provides a ranging method applied to an electronic device, wherein the electronic device includes a photoelectric ToF detection chip, a processor, a transmitter and a receiver, the receiver includes a photoelectric ToF detection element, and the photoelectric ToF detection chip supports the first ranging method and the second ranging method;
  • The method includes: the photoelectric ToF detection chip determines the imaging position of the measured object on the photoelectric ToF detection element based on the first ranging method, and the processor calculates a first distance from the measured object to the electronic device according to the determined imaging position; the photoelectric ToF detection chip determines, based on the second ranging method, the time elapsed while the light is reflected by the measured object and received by the receiver, and the processor calculates a second distance from the measured object to the electronic device according to the determined time;
  • When the first distance or the second distance is less than the first preset distance, the processor takes the first distance as the ranging result, and when the first distance or the second distance is greater than or equal to the second preset distance, the processor takes the second distance as the ranging result, wherein the second preset distance is greater than or equal to the first preset distance.
  • the first ranging method may be a triangular ranging method
  • the second ranging method may be a time-of-flight ranging method
  • the first distance is the distance measured by the triangular ranging method
  • The second distance is the distance measured by the time-of-flight ranging method.
  • That is, the ranging method of the present application can be realized by the cooperation of the photoelectric ToF detection chip and the processor: the photoelectric ToF detection chip uses the triangulation ranging method or the time-of-flight ranging method to measure the distance from the measured object to the electronic device, and the processor then judges the ranging scene according to the distance and decides whether to use the distance measured by the triangulation ranging method or by the time-of-flight ranging method as the final ranging result.
  • In a possible implementation, when the first distance or the second distance is less than the first preset distance, the first distance is used as the ranging result, and when the first distance or the second distance is greater than or equal to the second preset distance, the second distance is used as the ranging result, wherein the second preset distance is greater than or equal to the first preset distance.
  • The embodiment of the present application also provides an electronic device, the electronic device includes a photoelectric ToF detection chip, a processor, a transmitter and a receiver, the receiver includes a photoelectric ToF detection element, and the photoelectric ToF detection chip supports a first ranging method and a second ranging method; wherein the photoelectric ToF detection chip can determine the imaging position of the measured object on the photoelectric ToF detection element based on the first ranging method, and can determine the time elapsed while the light is reflected by the measured object and received by the receiver;
  • the processor can calculate a first distance from the measured object to the electronic device according to the determined imaging position information, and can calculate a second distance from the measured object to the electronic device according to the determined time, and
  • The processor can also use the first distance as the ranging result when the first distance or the second distance is less than the first preset distance, and use the second distance as the ranging result when the first distance or the second distance is greater than or equal to the second preset distance, wherein the second preset distance is greater than or equal to the first preset distance.
  • The embodiment of the present application also provides a readable storage medium storing a computer program which, when executed by a processor, implements the ranging method in any possible implementation of the first aspect above.
  • an embodiment of the present application provides a computer program product, which, when the computer program product is run on an electronic device, causes the electronic device to execute the ranging method in any possible implementation method in the first aspect.
  • Fig. 1 shows the ranging principle of the triangular ranging method
  • Fig. 2 shows the ranging principle of the time-of-flight ranging method
  • Fig. 3 shows the difference between signals of waveforms received when the time-of-flight ranging method is used for different measured objects
  • FIG. 4A shows a schematic diagram of a hardware structure of an electronic device 100 to which the ranging method of the present application is applicable;
  • FIG. 4B shows a schematic structural diagram of a specific implementation of the electronic device 100, wherein the lidar 11 and the receiver 12 are arranged on a base 50, and the base 50 is coaxially arranged with the motor 30;
  • FIG. 4C shows the relative positional relationship between the laser radar 11 and the beam splitter 110 in FIG. 4B;
  • FIG. 4D shows a structural schematic diagram of another specific implementation of the electronic device 100, wherein the laser radar 11 and the receiver 12 are arranged on the base 50, the base 50 and the motor 30 are not coaxially arranged, and the motor 30 drives the base through the transmission belt 40 50 turns;
  • Figure 5 shows a schematic structural diagram of timing using a photoelectric ToF detection element in the present application
  • Fig. 6 shows the statistical histogram of the time-of-flight obtained by the device shown in Fig. 5, wherein the horizontal axis represents the time-of-flight, and the vertical axis represents the frequency of time-of-flight;
  • FIG. 7 shows a schematic flow chart of the ranging method in the present application
  • Fig. 8 shows a schematic diagram of the principle of implementing the ranging method of the present application
  • FIG. 9 shows another schematic diagram of the principle of implementing the ranging method of the present application.
  • FIG. 10 shows another schematic diagram of the principle of implementing the ranging method of the present application.
  • FIG. 11 shows another schematic diagram of the principle of implementing the ranging method of the present application.
  • Fig. 12 shows a schematic structural diagram of a photoelectric ToF detection chip system.
  • the triangulation distance measurement method is to emit a beam of laser light at a certain angle through the laser radar on the device to illuminate the measured object.
  • The laser light is reflected at the surface of the measured object, the reflected light is converged by the lens of the receiver and imaged as a light spot on the photoelectric detection element, and the distance between the measured object and the device is then calculated according to the spot position of the measured object on the photoelectric detection element.
  • When the measured object moves, the photoelectric detection element can detect that the light spot of the measured object moves, and the spot displacement corresponds to the moving distance of the measured object, so the moving distance of the measured object can be calculated from the displacement of the light spot.
  • The photoelectric detection element may be a charge-coupled device (CCD) image sensor or a complementary metal-oxide-semiconductor (CMOS) image sensor.
  • The specific calculation process is as follows: as shown in Figure 1, the laser radar 11 emits a beam of laser light onto the measured object 200 at P, and the light reflected by the measured object 200 is imaged on the photoelectric detection element 13' of the receiver 12' as a light spot E; when the measured object moves to P1, the light reflected by the measured object 200 is imaged on the photoelectric detection element 13' as a light spot E'. The line connecting the laser radar 11 and the photoelectric detection element 13' is perpendicular to the direction of the laser light emitted by the laser radar 11, B is the center of the lens g, and OE and OE' are the paths along which the laser light is reflected from the measured object 200 to the photoelectric detection element. From this geometry and the displacement of the light spot from E to E', the distance from the measured object 200 to the device at P1 can be calculated by similar triangles.
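  • The exact equations from Figure 1 are not reproduced in the text above; as a rough illustration only, the following sketch uses the standard textbook triangulation relation d = f·s/x (laser beam assumed perpendicular to the baseline, detection element parallel to it). The baseline, focal length and offset values are illustrative assumptions, not values from the patent:

```python
def triangulation_distance(spot_offset_m: float,
                           baseline_m: float = 0.05,
                           focal_length_m: float = 0.01) -> float:
    """Textbook single-point triangulation: d = f * s / x.

    spot_offset_m:   displacement x of the imaged spot on the detection element (m).
    baseline_m:      distance s between the laser emission axis and the lens center (m).
    focal_length_m:  distance f from the lens to the detection element (m).
    Simplified geometry; the derivation in the patent's Figure 1 may differ in detail.
    """
    if spot_offset_m <= 0:
        raise ValueError("spot offset must be positive")
    return focal_length_m * baseline_m / spot_offset_m


# The farther the object, the smaller the spot offset on the detector:
print(triangulation_distance(100e-6))  # 100 um offset -> 5 m
print(triangulation_distance(10e-6))   # 10 um offset -> 50 m
# which is why detector resolution limits accuracy at long range.
```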
  • As the distance between the measured object 200 and the laser radar 11 increases, the light spot formed on the photoelectric detection element 13' by the light reflected from the measured object 200 becomes smaller and smaller, and the position change of the light spot as the measured object moves also becomes smaller and smaller, until the resolution of the photoelectric detection element 13' is no longer sufficient to distinguish the position change of the light spot.
  • Therefore, the triangulation ranging method has poor accuracy in long-distance ranging. To improve its long-distance accuracy, one option is to increase the distance between the lens g and the photoelectric detection element 13', but this makes the receiver module larger and, even so, can only guarantee ranging accuracy within 5 m; another option is to improve the resolution of the photoelectric detection element 13', but in practical applications the cost of a higher-resolution photoelectric detection element is much higher than the cost of enlarging the receiver module, and the achievable improvement in resolution is also limited.
  • the triangulation distance measurement method mainly utilizes the light intensity information of the measured object 200 for imaging, and then calculates the distance from the measured object 200 to the device according to the moving distance of the light spot on the photodetection element 13 ′.
  • Luminous intensity (I) refers to the intensity of light, that is, the luminous flux emitted by the light source within a solid angle in a certain direction. The light intensity is easily affected by the external environment.
  • For example, under strong light interference, the intensity of the light reflected by the measured object 200 is affected, so that the imaging of the measured object 200 on the photoelectric detection element 13' is poor, that is, the position of the light spot does not match the actual position, which further affects the final ranging result.
  • the triangular ranging method is not suitable for long-distance ranging or scenes with strong light interference, for example, it is not suitable for logistics service equipment that requires outdoor use.
  • The time-of-flight ranging method uses the time t taken by the laser to travel from the laser radar 11 to the measured object 200 and, after reflection by the measured object 200, back to the receiver 12', together with the speed of light c, to calculate the distance from the laser radar 11 to the measured object 200; that is, the distance from the laser radar 11 to the measured object 200 is d = c·t/2.
  • The time-of-flight ranging method mainly uses light energy information for ranging; that is, as long as the pulse reflected back by the measured object 200 is detected, the distance from the measured object 200 to the device can be determined from the emission time of the laser pulse and the time of the reflected pulse, without needing to consider the specific imaging position of the measured object on the photoelectric detection element.
  • the photoelectric detection element suitable for the time-of-flight ranging method is generally a photoelectric ToF detection element.
  • The maximum ranging range of the time-of-flight ranging method can reach hundreds of meters, and its ranging accuracy does not decrease as the distance between the measured object 200 and the laser radar 11 increases.
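  • As a small numeric sketch of the time-of-flight relation d = c·t/2 (values are illustrative), showing why short-range measurements are sensitive to sub-nanosecond timing errors:

```python
C = 299_792_458.0  # speed of light in m/s


def tof_distance(round_trip_time_s: float) -> float:
    """Distance from the round-trip time of the laser pulse: d = c * t / 2."""
    return C * round_trip_time_s / 2.0


def round_trip_time(distance_m: float) -> float:
    """Round-trip time for a given distance: t = 2 * d / c."""
    return 2.0 * distance_m / C


# A 0.3 m target returns in about 2 ns; a 100 m target in about 667 ns.
# A rising-edge timing error of a fraction of a nanosecond is therefore a much
# larger relative error at short range than at long range.
print(round_trip_time(0.3))    # ~2.0e-9 s
print(round_trip_time(100.0))  # ~6.7e-7 s
print(tof_distance(2e-9))      # ~0.3 m
```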
  • The timer in the time-of-flight ranging method takes its timing reference from the rising edge of the detected reflected pulse. Since different objects reflect laser pulses differently, as shown in Figure 3, for the circular wave pulses emitted by the laser radar 11, the pulses reflected by some objects 200' are square pulse waves A, while the pulses reflected by other objects 200'' are circular wave pulses B. Because the rules for determining the rising edges of different waveforms differ, the timer introduces some error when timing according to the rising-edge signals of the different reflected waveforms; for example, timing the peak of the circular wave pulse B as the "rising edge" is delayed relative to timing the rising edge of the square pulse wave A.
  • In the short-distance ranging scene, the influence of the waveform is more obvious, that is, the above-mentioned waveform signal distortion is more likely to occur, so the impact on the timing of the timer is also more obvious, which in turn reduces the accuracy of the time-of-flight ranging method in the short-distance ranging scene.
  • the above-mentioned pulse waveform emitted by the laser radar 11 and the reflected pulse waveform received by the receiver 12' are only exemplary and do not constitute a limitation to the present application.
  • The laser radar 11 can also transmit square pulse waves or pulse waves of other waveforms, which is not limited in this application.
  • the time-of-flight ranging method is not suitable for short-distance ranging.
  • For example, it is not suitable for devices such as household sweeping robots, which often need to range along walls or avoid small target obstacles.
  • the present application provides a ranging method.
  • In the ranging method of the present application, according to the distance between the measured object and the device, it is decided whether to use the distance between the measured object and the device measured by the triangulation ranging method or by the time-of-flight ranging method as the ranging result.
  • Specifically, the electronic device can use the triangulation ranging method and/or the time-of-flight ranging method to continuously determine the distance from the measured object to the electronic device. When this distance indicates a long-distance ranging scene, the electronic device uses the distance measured by the time-of-flight ranging method, which is more suitable for long-distance ranging scenes, as the ranging result; when it indicates a short-distance ranging scene, the electronic device uses the distance measured by the triangulation ranging method, which is more suitable for short-distance ranging scenes, as the ranging result, so as to ensure that the ranging results of the electronic device are relatively accurate in both short-distance and long-distance ranging.
  • Whether the current ranging scene is a short-distance ranging scene or a long-distance ranging scene can be determined by setting the first preset distance and the second preset distance: for example, when the distance from the measured object to the electronic device is less than the first preset distance, it is a short-distance ranging scene, and when the distance from the measured object to the electronic device is greater than or equal to the second preset distance, it is a long-distance ranging scene.
  • the setting of the first preset distance may be an empirical or experimental value, such as 1 meter
  • the setting of the second preset distance may also be an empirical or experimental value, such as 100 meters.
  • the second preset distance may also be equal to the first preset distance, that is, when the distance from the measured object to the electronic device is greater than or equal to the first preset distance, it is a long-distance ranging scenario.
  • the above-mentioned electronic device 100 may be an intelligent sweeping robot, a logistics service robot, an autonomous vehicle, etc., and this application does not impose any limitation on the type of the electronic device 100 .
  • FIG. 4A shows a schematic diagram of a hardware structure of an electronic device 100 capable of implementing the ranging method of the present application.
  • the hardware structure of the electronic device 100 includes a processor 10 , a memory 20 , a power module 30 , a laser radar 11 , a receiver 12 , and a timer 14 .
  • The receiver 12 includes a photoelectric ToF detection element 13 and is used to receive the laser light reflected back by the measured object 200; the laser light is imaged on the photoelectric ToF detection element 13, and the photoelectric ToF detection element 13 can output the imaging position information of the reflected laser light to the processor 10, so that the processor 10 can determine the distance from the measured object 200 to the electronic device 100 according to the imaging position information of the reflected laser light.
  • The photoelectric ToF detection element 13 also transmits the electric signal of the received reflected laser light to the timer 14, so that the timer 14 records the flight time of the laser light between the electronic device 100 and the measured object 200; this will be introduced below in conjunction with the timer 14.
  • the photoelectric ToF detection element 13 may be an M ⁇ N two-dimensional array composed of single photon avalanche diodes (single photon avalanche diode, SPAD), that is, the SPAD array 131 as shown in FIG. 5 .
  • the benefit of using a SPAD is that it is more sensitive than other photodiodes. Specifically, when the operating voltage of the SPAD increases and its internal electric field continues to increase to a certain critical value, the electron-hole pairs generated after absorbing photons can collide to generate new electron-hole pairs, and this chain reaction is like an avalanche.
  • the photon signal can be amplified to make the SPAD work in Geiger mode, that is, as long as a photon is received by the receiver 12 and generates electrons, a large current/voltage pulse signal will be generated to represent the occurrence of the detection event.
  • the photoelectric ToF detection element 13 may also be an array composed of other photonic diodes, which is not limited in the present application.
  • the photoelectric ToF detection element 13 may be strip-shaped, and its aspect ratio may be 4:3, 2:1 or 3:1, which is not limited in the present application.
  • The memory 20 may be an internal storage unit in the electronic device 100, such as a hard disk or memory of the electronic device 100.
  • the memory 20 can also be an external storage device, such as a plug-in hard disk equipped on the robot, a smart media card (smart media card, SMC), a secure digital (secure digital, SD) card, a flash memory card (Flash Card) and the like.
  • the memory 20 may also include both an internal storage unit and an external storage device.
  • the memory 20 is used to store the computer program 21 and other data and programs required by the system.
  • the memory 20 can also be used to temporarily store data that has been output or will be output.
  • the computer program 21 includes computer program code, which may be in the form of source code, object code, executable file or some intermediate form and the like.
  • the computer-readable storage medium may include any entity or device capable of carrying computer program code, recording medium, U disk, removable hard disk, magnetic disk, optical disk, computer memory, read-only memory (read-only memory, ROM), random access Memory (random access memory, RAM), electrical carrier signal, telecommunication signal and software distribution medium, etc.
  • the processor 10 can execute the computer program 21 stored in the memory 20 to implement the ranging method provided in this application.
  • The computer program 21 can be divided into one or more modules/units, which are stored in the memory 20 and executed by the processor 10 to complete the methods of the embodiments of the present application described below.
  • One or more modules/units may be a series of computer program instruction segments capable of accomplishing specific functions, and the instruction segments are used to describe the execution process of the computer program 21 in the robot.
  • The processor 10 can be used to calculate the distance from the measured object 200 to the electronic device according to the imaging position of the measured object on the photoelectric ToF detection element 13, or to determine the distance from the measured object 200 to the electronic device according to the time of the reflected pulse from the measured object 200 received by the receiver 12 and the time of the laser pulse emitted by the laser radar 11.
  • The processor 10 can be a central processing unit (CPU), or other general-purpose processors, digital signal processors (DSP), application-specific integrated circuits (ASIC), field-programmable gate arrays (FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, etc.
  • a general-purpose processor may be a microprocessor, or the processor may be any conventional processor, or the like.
  • The processor 10 can also control the movement of the electronic device 100 or the steering of the laser radar 11 according to the measured distance between the measured object and the device, for example, controlling the steering gear of the electronic device 100 to move left, right, forward or backward, or controlling the above-mentioned laser radar 11 to rotate at a certain frequency.
  • The electronic device 100 may also include a controller (not shown in the figure), and the controller then controls the movement of the electronic device 100 or the steering of the laser radar 11 according to the instructions of the processor 10, which is not limited in this application.
  • The laser radar 11 is used to collect information on the current scene in real time to obtain laser scanning information of the current scene (such as obstacle contour information) for analysis and processing; for example, the processor 10 compares the laser scanning information with the preset scene information stored in the memory 20, determines whether there is an obstacle in the current scene, and adjusts the travel route of the electronic device 100 according to the judgment result to avoid accidents such as collisions.
  • The laser radar 11 can also be used in conjunction with a 3D visual map for device positioning, for example, in an intelligent sweeping robot. Specifically, the electronic device 100 can compare the obstacle outline information scanned by the laser radar 11 with the obstacle information in the 3D visual map, so as to avoid obstacles more accurately.
  • the laser radar 11 is used to emit laser pulses at a certain emission angle.
  • the laser pulse emitted by the laser radar 11 may be an infrared laser pulse with a wavelength ranging from 780 nm to 940 nm, for example, an infrared laser pulse with a wavelength of 905 nm.
  • In some embodiments, the device may have multiple laser radars, such as the laser radar 11 and the laser radar 11', and the angle between the lasers emitted by the two is between 4° and 15°, such as 5° or 10°.
  • The purpose of this is to avoid ranging blind zones when the triangulation ranging method is used in short-distance ranging scenes; that is, due to the short distance between the measured object and the device, the imaging position of the measured object would otherwise exceed the detectable range of the photoelectric ToF detection element 13. This will be introduced below.
  • Alternatively, a beam splitter 110 can also be directly added to the existing laser radar 11 to perform splitting processing on the laser light emitted by the laser radar 11 to obtain laser light of a first optical path and laser light of a second optical path, wherein the emission angle of the first optical path is consistent with the emission angle of the laser radar 11 without splitting processing, and the angle between the laser light of the second optical path and the laser light of the first optical path is in the range of 4° to 15°, such as 5° or 10°.
  • This method can also avoid the ranging blind area that occurs when the triangulation ranging method is used in the short-distance ranging scene. This will be introduced below.
  • As shown in FIG. 4B, the laser radar 11 and the receiver 12 are fixed on the base 50, and the base 50 is arranged coaxially with the motor 30; the processor 10 controls the motor 30 to rotate and drive the base 50 to rotate synchronously, thereby driving the laser radar 11 and the receiver 12 to rotate to obtain a 360° scanning field of view (FoV).
  • In other embodiments, as shown in FIG. 4D, the base 50 carrying the laser radar 11 and the receiver 12 may instead not be coaxial with the motor 30; the motor 30 then drives the base 50 to rotate by means of the transmission belt 40, thereby driving the laser radar 11 and the receiver 12 to rotate, which is not limited in this application.
  • the above-mentioned processor 10, memory 20, laser radar 11 and receiver 12 may constitute a photoelectric detection module for detecting the distance of the measured object.
  • A timer (time-to-digital converter, TDC) 14 is also called a time-to-digital conversion circuit. In the implementations of the present application, it is used to record the flight time required for ranging with the time-of-flight ranging method. Specifically, when the laser radar 11 emits laser light, the drive circuit (not shown in the figure) in the device generates a trigger signal, and this trigger signal is used as the starting time point t_start of the time-of-flight measurement.
  • The SPAD array 13 then generates a voltage pulse signal (such as the above-mentioned square pulse wave A signal or circular wave pulse B signal) to mark the time point t_stop at which the time-of-flight detection is completed, and outputs the voltage pulse signal to the timer 14; the timer 14 records (t_stop − t_start) and converts it into a binary digital signal.
  • In some embodiments, the SPAD array 13 may be used to count multiple times of flight, and the time of flight with the highest occurrence frequency is then used as the final time of flight. It can be understood that, at a certain moment, the greater the number of SPADs 131 in the SPAD array 13 that detect the laser light reflected back by the measured object, the stronger the energy of the light reflected by the measured object at that moment; correspondingly, the frequency of the time of flight corresponding to that moment will be higher (because multiple SPADs 131 detect the reflected light at the same time), that is, the time of flight corresponding to that moment is most likely the time elapsed from when the laser is emitted until it is reflected by the measured object back to the receiver 12.
  • Fig. 6 shows the time-of-flight statistical histogram of the SPAD array 13, the horizontal axis represents the time-of-flight, and the vertical axis represents the frequency at which this time-of-flight occurs.
  • For example, the frequency of the time-of-flight t12 in Fig. 6 is the highest, so the time of flight of the laser from emission to reflection back to the receiver 12 is taken as t12.
  • other statistical methods may also be used to determine the flight time, which is not limited in this application.
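  • A minimal sketch of the statistic described above: collect many timer readings (t_stop − t_start), build a histogram, and take the most frequent bin as the final time of flight. The bin width and sample values are illustrative assumptions:

```python
from collections import Counter


def most_frequent_tof(tof_samples_s, bin_width_s: float = 1e-9) -> float:
    """Return the time of flight whose histogram bin occurs most often.

    tof_samples_s: iterable of (t_stop - t_start) values recorded by the timer, in seconds.
    bin_width_s:   histogram bin width in seconds (1 ns is an illustrative choice).
    """
    bins = Counter(round(t / bin_width_s) for t in tof_samples_s)
    peak_bin, _count = bins.most_common(1)[0]
    return peak_bin * bin_width_s


# Simulated readings: most detections cluster near 334 ns (about 50 m), plus a
# few ambient-light outliers; the histogram peak rejects the outliers.
samples = [334e-9] * 40 + [333e-9] * 10 + [120e-9, 501e-9, 87e-9]
t_flight = most_frequent_tof(samples)
print(t_flight, 299_792_458.0 * t_flight / 2)  # ~3.34e-7 s, ~50 m
```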
  • the power module 30 may include a power supply, power management components, and the like.
  • the power source can be a battery.
  • the power management component includes a charge management module and a power management module.
  • the charging management module is used to receive charging input from the charger; the power management module is used to connect the power supply, the charging management module and the processor 10 .
  • the power management module receives the input of the power supply and/or the charging management module, and supplies power for the processor 10, the laser radar 11 and the like.
  • It can be understood that FIG. 4A is only an example of the hardware structure of the electronic device 100 and does not constitute a limitation on it; the electronic device 100 may also include input and output devices, network access devices, buses, communication modules, and so on.
  • the ranging method shown in FIG. 7 is executed by the processor of the above-mentioned electronic device 100 .
  • the method includes:
  • First, the first ranging method and/or the second ranging method is used to determine whether the distance from the measured object 200 to the electronic device 100 exceeds a first preset distance. The purpose of this is to determine whether the distance between the measured object 200 and the electronic device 100 belongs to a short-distance ranging scene or a long-distance ranging scene, so as to decide, according to the specific ranging scene, which ranging method's measured distance to use as the final ranging result.
  • The first preset distance is an empirical or experimental value and can be, for example, 1 meter; that is, a ranging scene in which the distance between the measured object 200 and the electronic device 100 is less than 1 meter is a short-distance ranging scene, and a ranging scene in which the distance is greater than 1 meter is a long-distance ranging scene.
  • the first ranging method may be a triangulation ranging method
  • the second ranging method may be a time-of-flight ranging method.
  • In this way, the electronic device 100 may, according to the requirements, use as the ranging result the distance measured by the triangulation ranging method, which is more suitable for the short-distance ranging scene, or the distance measured by the time-of-flight ranging method, which is more suitable for the long-distance ranging scene.
  • It can be understood that short-distance ranging scenes and long-distance ranging scenes are often distinguished not by a single distance value but by a distance range: for example, below the minimum value of this range it is a short-distance ranging scene, and at or above the maximum value of this range it is a long-distance ranging scene; in between, the two ranging methods differ little in accuracy, so in order to save power consumption of the electronic device 100, the ranging method may not be switched in this case, or the distance from the measured object to the electronic device determined by either the first ranging method or the second ranging method may be output as the ranging result, which is not limited in this application.
  • Specifically, the triangulation ranging method can be used to judge the distance from the measured object 200 to the electronic device 100, and the time-of-flight ranging method can also be used to judge the distance from the measured object 200 to the electronic device 100.
  • the distance between the measured object 200 and the electronic device 100 can also be determined in combination with the triangulation ranging method and the time-of-flight ranging method, which is not limited in the present application.
  • It should be noted that when the measured object 200 is very close to the electronic device 100, the imaging position of the measured object 200 exceeds the detectable range of the photoelectric detection element 13'; that is, in this case, the electronic device cannot judge the distance from the measured object 200 to the electronic device 100 by using the triangulation ranging method, which is the so-called "ranging blind zone".
  • the "blind area of distance measurement” is shown in the scheme of this application, that is, as shown in Figure 8, when the measured object 200 is at P4, under the light emitted by the laser radar 11 along the optical path 1, the position of its imaging spot 2 will exceed the photoelectric ToF
  • the detectable range of the detection element 13 makes it impossible for the electronic device 100 to measure the distance from the measured object 200 to the electronic device 100 by triangulation.
  • To this end, when the electronic device 100 detects that the distance between the measured object 200 and the electronic device 100 is less than or equal to the third preset distance, it can activate the second laser radar 11' provided on it and use the second laser radar 11' to detect the measured object 200, wherein the light emitted by the second laser radar 11' can ensure that the imaging position of the measured object 200 does not exceed the detectable range of the photoelectric detection element 13 when the distance is less than or equal to the third preset distance.
  • the third preset distance is smaller than the above-mentioned first preset distance, and its value may also be an empirical value or an experimental value, for example, the value of the third preset distance may be 0.3 meters.
  • Specifically, the angle of the light emitted by the second laser radar can be designed so that the imaging position of the measured object 200 does not exceed the detectable range of the photoelectric ToF detection element 13 when the distance is less than or equal to the third preset distance; for example, the light emitted by the second laser radar can be designed to have an included angle of 5° to 10° with the light emitted by the first laser radar.
  • For example, the angle between the light path 3 of the light emitted by the second laser radar and the light path 1 of the light emitted by the first laser radar is 10°.
  • Alternatively, the beam splitter 110 can be directly installed on the first laser radar 11 to perform splitting processing on the light emitted by the laser radar 11 to obtain two different paths of light, wherein the first path of light keeps the original outgoing direction and the angle between the second path of light and the original outgoing direction ranges from 5° to 10°; for example, the optical path 4 emitted by the first laser radar is split into an optical path 41 and an optical path 42, and the angle between the optical path 41 and the optical path 42 is 10°.
  • In this way, the measured object 200 can be imaged on the photoelectric ToF detection element 13 under the light of the second optical path; the photoelectric ToF detection element 13 detects the position of the imaging spot of the measured object 200, and triangulation ranging is then used, according to the position of the imaging spot of the measured object 200, to determine whether the distance between the measured object 200 and the electronic device 100 exceeds the first preset distance.
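  • As a rough geometric illustration of this blind zone, and of why tilting the second beam (or the second split light path) by a few degrees helps, the sketch below uses the same simplified pinhole triangulation model as before (baseline s, lens-to-detector distance f, a detection element of finite length); all numbers are illustrative assumptions rather than values from the patent:

```python
import math


def spot_offset(d_m: float, baseline_m: float, focal_m: float,
                tilt_deg: float = 0.0) -> float:
    """Lateral image offset on the detection element for an object at distance d.

    Simplified pinhole model: the laser and the lens are separated by baseline_m,
    and the detection element sits focal_m behind the lens. Tilting the beam
    toward the receiver by tilt_deg shifts the illuminated point sideways by
    d * tan(tilt), which reduces the offset and shrinks the near blind zone.
    """
    tilt = math.radians(tilt_deg)
    return focal_m * (baseline_m - d_m * math.tan(tilt)) / d_m


def in_blind_zone(d_m: float, detector_len_m: float, **geometry) -> bool:
    """True if the imaged spot falls outside the usable length of the detector."""
    offset = spot_offset(d_m, **geometry)
    return not (0.0 <= offset <= detector_len_m)


geom = dict(baseline_m=0.05, focal_m=0.01)
# At 0.2 m the straight beam images outside a 2 mm detector (blind zone),
# while a beam tilted by 10 degrees still lands on it.
print(in_blind_zone(0.2, 0.002, tilt_deg=0.0, **geom))   # True
print(in_blind_zone(0.2, 0.002, tilt_deg=10.0, **geom))  # False
```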
  • the distance from the measured object 200 to the electronic device 100 is less than the first preset distance, use the distance from the measured object 200 to the electronic device 100 determined by the first distance measuring method as a distance measurement result. That is, within the first preset distance, the distance from the measured object 200 to the electronic device 100 measured by the first distance measuring method is more accurate than the distance from the measured object 200 to the electronic device 100 measured by the second distance measuring method. Therefore, the distance from the measured object 200 to the electronic device 100 measured by the first distance measurement method is used as the final distance measurement result.
  • For example, if the distance between the measured object 200 and the electronic device 100 detected by the triangulation ranging method is 0.789 meters and the distance detected by the time-of-flight ranging method is 0.8 meters, both of which are less than the above-mentioned first preset distance of 1 meter, then the distance measured by the triangulation ranging method is obviously more accurate than the distance measured by the time-of-flight ranging method, so in this case the distance of 0.789 meters detected by the triangulation ranging method is used as the ranging result.
  • The principle of triangulation ranging here is that the light reflected by the measured object 200 is imaged on the above-mentioned photoelectric ToF detection element 13, and the electronic device 100 then determines the distance between the measured object 200 and the electronic device 100 using the triangulation ranging principle shown in FIG. 1.
  • If the distance from the measured object 200 to the electronic device 100 is greater than or equal to the second preset distance, the distance from the measured object 200 to the electronic device 100 determined by the second ranging method is used as the ranging result, wherein the second preset distance is greater than or equal to the first preset distance. That is, beyond the first preset distance, the distance from the measured object 200 to the electronic device 100 measured by the second ranging method is more accurate than that measured by the first ranging method, so the distance measured by the second ranging method is used as the final ranging result.
  • For example, if the distance between the measured object 200 and the electronic device 100 detected by the triangulation ranging method is 59 meters and the distance detected by the time-of-flight ranging method is 58.799 meters, both of which are greater than the above-mentioned first preset distance of 1 meter, then the distance measured by the time-of-flight ranging method is obviously more accurate than the distance measured by the triangulation ranging method, so in this case the distance of 58.799 meters detected by the time-of-flight ranging method is used as the ranging result.
  • Fig. 12 shows a schematic diagram of a photoelectric ToF detection chip system 1200 according to some embodiments of the present application.
  • the photoelectric ToF detection chip 1200 can be applied to the electronic device 100 shown in FIG. 4A above, so as to implement the distance measuring method in the above-mentioned implementations of the present application.
  • As shown in FIG. 12, the photoelectric ToF detection chip system 1200 can include one or more processors 1210, a system memory 1202, a non-volatile memory (NVM) 1203, an input/output (I/O) device 1204, a communication interface 1205, and system control logic 1206 for coupling the processor 1210, the system memory 1202, the non-volatile memory 1203, the I/O device 1204 and the communication interface 1205.
  • the processor 1210 may include one or more single-core or multi-core processors.
  • processor 1210 may include any combination of general-purpose processors and special-purpose processors (eg, graphics processors, application processors, baseband processors, etc.).
  • the processor 1210 may be used to implement the ranging method provided in the embodiment shown in FIG. 7 .
  • the system memory 1202 is a volatile memory, such as random access memory (Random-Access Memory, RAM), double data rate synchronous dynamic random access memory (Double Data Rate Synchronous Dynamic Random Access Memory, DDR SDRAM), etc.
  • the system memory is used to temporarily store data and/or instructions.
  • the system memory 1202 may be used to store the above-mentioned executable program for implementing the ranging method.
  • The non-volatile memory 1203 may include one or more tangible, non-transitory computer-readable media for storing data and/or instructions.
  • The non-volatile memory 1203 may include any suitable non-volatile memory such as flash memory and/or any suitable non-volatile storage device, such as a hard disk drive (HDD), a compact disc (CD), a digital versatile disc (DVD), a solid-state drive (SSD), etc.
  • The non-volatile memory 1203 may also be a removable storage medium, such as a secure digital (SD) memory card and the like.
  • The system memory 1202 and the non-volatile memory 1203 may include, respectively, temporary and permanent copies of instructions 1207.
  • The instructions 1207 may include program instructions that, when executed by at least one of the processors 1210, cause the electronic device 100 to implement the ranging method provided by the various embodiments of the present application.
  • The I/O devices 1204 may include a user interface enabling a user to interact with the electronic device 100.
  • The input/output (I/O) devices 1204 may include output devices, such as a display for presenting interfaces of the electronic device 100, and may also include input devices such as a keyboard, a mouse, and a touch screen.
  • Product developers can interact with the electronic device 100 through the user interface and input devices such as a keyboard, a mouse, and a touch screen.
  • The communication interface 1205 is configured to provide a wired or wireless communication interface for the electronic device 100, so that it can communicate with any other suitable device through one or more networks.
  • The communication interface 1205 can be integrated with other components of the photoelectric ToF detection chip system 1200; for example, the communication interface 1205 can be integrated in the processor 1210.
  • The electronic device 100 can communicate with other devices through the communication interface 1205.
  • the photoelectric ToF detection chip system 1200 outputs the distance from the measured object 200 to the device measured by the above-mentioned distance measuring method through the communication interface 1205 .
  • The system control logic 1206 may include any suitable interface controller to provide any suitable interface to other modules of the electronic device 100.
  • The system control logic 1206 may include one or more memory controllers to provide an interface to the system memory 1202 and the non-volatile memory 1203.
  • At least one of the processors 1210 may be packaged together with logic for one or more controllers of the system control logic 1206 to form a System in Package (SiP). In some other embodiments, at least one of the processors 1210 can also be integrated on the same chip with the logic for one or more controllers of the system control logic 1206 to form a System-on-Chip (SoC).
  • the structure of the chip system shown in the embodiment of the present application does not constitute a specific limitation on the chip system.
  • The chip system may include more or fewer components than shown, or combine certain components, or split certain components, or use a different arrangement of components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • In addition, the present application also provides a ranging method, which can be applied to an electronic device, the electronic device including a photoelectric ToF detection chip, a processor, a laser radar, and a receiver, wherein the receiver further includes the photoelectric ToF detection element shown in FIG. 4A above, and the photoelectric ToF detection element can also support the above two ranging methods.
  • the method includes:
  • the photoelectric ToF detection chip determines, based on the first ranging method, the imaging position of the measured object on the photoelectric ToF detection chip, and the processor calculates the first distance from the measured object to the electronic device according to the determined imaging position, and
  • the photoelectric ToF detection chip determines, based on the second ranging method, the time elapsed while the light is reflected by the measured object and then received by the receiver, and the processor calculates the second distance from the measured object to the electronic device according to the determined time.
  • The processor takes the first distance as the ranging result when the first distance or the second distance is less than the first preset distance, and takes the second distance as the ranging result when the first distance or the second distance is greater than or equal to the second preset distance, wherein the second preset distance is greater than or equal to the first preset distance.
  • the first ranging method may be the triangulation ranging method,
  • the second ranging method may be the time-of-flight ranging method.
  • The first distance is the distance from the measured object to the electronic device measured by the triangulation ranging method,
  • and the second distance is the distance from the measured object to the electronic device measured by the time-of-flight ranging method.
  • When the photoelectric ToF detection chip has detected the distance from the measured object to the electronic device using the triangulation ranging method or the time-of-flight ranging method, it sends the distance to the processor of the electronic device, and the processor determines, according to the received distance, whether the current ranging scene is a short-distance ranging scene or a long-distance ranging scene. If the first distance or the second distance is less than the first preset distance, the current scene is a short-distance ranging scene, and the first distance measured by the triangulation ranging method is used as the ranging result.
  • If the first distance or the second distance is greater than or equal to the second preset distance, the current scene is a long-distance ranging scene, and the second distance measured by the time-of-flight ranging method is used as the ranging result, which ensures the accuracy of the ranging result of the electronic device in both short-distance and long-distance ranging scenes. A sketch of the two distance computations follows.
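  • As a concrete illustration of the division of work described above (the chip reporting the raw imaging position and elapsed time, and the processor converting them into the first and second distances), the following Python sketch shows one possible form of the two conversions. It is only a sketch: the class, the numerical values, and the simplified triangulation relation D ≈ f·L/x (with x the spot offset on the detection element) are assumptions made for illustration, not the exact geometry of Fig. 1.

        from dataclasses import dataclass

        C_M_PER_S = 299_792_458.0  # speed of light

        @dataclass
        class RangingConverter:
            focal_length_m: float  # f, focal length of the receiving lens
            baseline_m: float      # L, emitter-to-detector baseline

            def first_distance(self, spot_offset_m: float) -> float:
                # Triangulation: simplified similar-triangle relation D ~ f * L / x,
                # where x is the imaging position (spot offset) on the detection element.
                return self.focal_length_m * self.baseline_m / spot_offset_m

            @staticmethod
            def second_distance(elapsed_s: float) -> float:
                # Time of flight: D = c * t / 2 for the measured round-trip time t.
                return C_M_PER_S * elapsed_s / 2.0

        # Hypothetical optics and readings, for illustration only.
        conv = RangingConverter(focal_length_m=0.004, baseline_m=0.02)
        d1 = conv.first_distance(spot_offset_m=1.0e-4)  # about 0.8 m
        d2 = conv.second_distance(elapsed_s=5.34e-9)    # about 0.8 m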
  • The present application also provides another electronic device; descriptions that are the same as those of the above ranging method can be found in the relevant description above and are not repeated below.
  • The electronic device includes a photoelectric ToF detection chip, a processor, a laser radar, and a receiver, wherein the receiver includes a photoelectric ToF detection element, and the photoelectric ToF detection chip can support the first ranging method and the second ranging method.
  • The photoelectric ToF detection chip can determine, based on the first ranging method, the imaging position of the measured object on the photoelectric ToF detection element, and can determine, based on the second ranging method, the time elapsed while the light is reflected by the measured object and then received by the receiver;
  • the processor can calculate a first distance from the measured object to the electronic device according to the determined imaging position information, and can calculate a second distance from the measured object to the electronic device according to the determined time, and
  • the processor can also use the first distance as the ranging result when the first distance or the second distance is less than the first preset distance,
  • and use the second distance as the ranging result when the first distance or the second distance is greater than or equal to the second preset distance, wherein the second preset distance is greater than or equal to the first preset distance. The elapsed time itself can be estimated as sketched below.
  • an embodiment of the present application also provides a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium, and when the computer program is executed by a processor, the steps in each of the foregoing method embodiments can be realized.
  • The embodiment of the present application further provides a computer program product.
  • When the computer program product runs on the electronic device 100, the electronic device 100 can implement the steps in the above-mentioned method embodiments.
  • The embodiment of the present application also provides an electronic device 100, and the electronic device 100 includes: a photoelectric detection module, at least one processor, a memory, and a computer program stored in the memory and executable on the at least one processor, where the steps in any of the above method embodiments are implemented when the processor executes the computer program.
  • The photoelectric detection module itself may include a processor, a memory and other devices; that is, the ranging method may be implemented by the processor and memory of the photoelectric detection module, or the measurement data may be sent by the photoelectric detection module to the processor of the electronic device for implementation, which is not limited in this application.
  • Embodiments of the mechanisms disclosed in this application may be implemented in hardware, software, firmware, or a combination of these implementation methods.
  • Embodiments of the present application may be implemented as a computer program or program code executed on a programmable system comprising at least one processor, a storage system (including volatile and non-volatile memory and/or storage elements) , at least one input device, and at least one output device.
  • Program code can be applied to input instructions to perform the functions described herein and to generate output information.
  • The output information may be applied to one or more output devices in a known manner.
  • For the purposes of this application, a processing system includes any system having a processor such as, for example, a digital signal processor (Digital Signal Processor, DSP), a microcontroller, an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), or a microprocessor.
  • the program code can be implemented in a high-level procedural language or an object-oriented programming language to communicate with the processing system.
  • Program code can also be implemented in assembly or machine language, if desired.
  • the mechanisms described in this application are not limited in scope to any particular programming language. In either case, the language may be a compiled or interpreted language.
  • the disclosed embodiments may be implemented in hardware, firmware, software, or any combination thereof.
  • The disclosed embodiments can also be implemented as instructions carried by or stored on one or more transitory or non-transitory machine-readable (e.g., computer-readable) storage media, which can be read and executed by one or more processors.
  • instructions may be distributed over a network or via other computer-readable media.
  • A machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer), including, but not limited to, floppy disks, optical disks, compact disc read-only memories (CD-ROMs), magneto-optical disks, read-only memory (Read Only Memory, ROM), random access memory (Random Access Memory, RAM), erasable programmable read-only memory (Erasable Programmable Read Only Memory, EPROM), electrically erasable programmable read-only memory (Electrically Erasable Programmable Read-Only Memory, EEPROM), magnetic or optical cards, flash memory, or tangible machine-readable storage used to transmit information over the Internet by means of electrical, optical, acoustic or other forms of propagated signals (for example, carrier waves, infrared signals, digital signals, etc.).
  • a machine-readable medium includes any type of machine-readable medium suitable for storing or transmitting electronic instructions or information in a form readable by a machine (eg, a computer).
  • each unit/module mentioned in each device embodiment of this application is a logical unit/module.
  • Physically, a logical unit/module can be a physical unit/module, or a part of a physical unit/module,
  • or can be realized as a combination of multiple physical units/modules; the physical implementation of these logical units/modules is not the most important, and the combination of the functions realized by these logical units/modules is the key to solving the technical problems raised in this application.
  • In addition, in order to highlight the innovative part of this application, the above-mentioned device embodiments of this application do not introduce units/modules that are not closely related to solving the technical problems raised by this application, which does not mean that other units/modules do not exist in the above-mentioned device embodiments.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

A ranging method, a photoelectric detection module, a chip, an electronic device and a medium. The ranging method can be applied to an electronic device (100) that includes a photoelectric time-of-flight (ToF) detection element (13) capable of supporting both the triangulation ranging method and the time-of-flight ranging method. The electronic device (100) uses the triangulation ranging method and/or the time-of-flight ranging method to judge whether the current ranging scene is a long-distance ranging scene or a short-distance ranging scene; if it is a short-distance ranging scene, the result measured by the triangulation ranging method, which is better suited to short-distance ranging, is taken as the final ranging result, and if it is a long-distance ranging scene, the result measured by the time-of-flight ranging method, which is better suited to long-distance ranging, is taken as the final ranging result. In this way, the ranging accuracy of the electronic device (100) in both short-distance and long-distance ranging scenes can be guaranteed, and the applicable scenarios of the electronic device (100) are broadened.

Description

测距方法、光电探测模组、芯片、电子设备及介质
本申请要求于2022年02月24日提交中国专利局、申请号为202210175433.5、申请名称为“测距方法、光电探测模组、芯片、电子设备及介质”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及测距技术领域,尤其涉及一种测距方法、光电探测模组、芯片、电子设备及介质。
背景技术
在测距技术领域,一般采用三角测距法或飞行时间(Time of Flight,ToF)测距法测量被测物体与设备之间距离。但在实际应用中,三角测距法受限于设备中光电探测元件的分辨率,无法在远距离(例如100米以上)测距场景下保证测距精确度,而飞行时间测距法在近距离(例如1米以内)的测距情况下,容易发生波形信号失真,导致测得的飞行时间不够准确,进而影响近距离测距精确度。
如何保证测距方法在近距离测距和远距离测距的精确度,以同时适用近距离测距场景和远距离测距场景,成为亟待解决的问题。
发明内容
为了解决上述问题,本申请提供一种测距方法、光电探测模组、芯片、电子设备及介质。
第一方面,本申请实施例提供了一种测距方法,该测距方法可应用于电子设备,电子设备包括第一发射器和第一接收器,第一接收器包括光电飞行时间ToF探测元件,光电ToF探测元件支持第一测距方法和第二测距方法,方法包括:利用第一测距方法和/或第二测距方法判断被测物体至电子设备的距离是否超出第一预设距离;若被测物体至电子设备的距离小于第一预设距离,则将采用第一测距方法确定的被测物体至电子设备的距离作为测距结果;或者,若被测物体至电子设备的距离大于或等于第二预设距离,则将采用第二测距方法确定的被测物体至电子设备的距离作为测距结果,其中,第二预设距离大于或等于第一预设距离。
在一些实现方式中,上述电子设备可以是智能扫地机器人、物流服务类机器人等等,本申请对此不作限制。在一些实现方式中,光电ToF探测元件可以长条形,并且其长宽比大于等于3。例如其长宽比可以为3:1。在一些实现方式中,第一发射器可以为激光雷达。
在一些实现方式中,第一测距方法可以是三角测距法,第二测距方法可以为飞行时间测距法。可以理解,三角测距法更适用于近距离测距,而飞行时间测距法更适用于远距离测距,因此上述方法可根据被测物体至电子设备的距离,决定使用三角测距法还是飞行时间测距法测得的距离作为测距结果,以保证测距结果的精确度。
具体地,在判断被测物体至电子设备的距离的时候,则可以采用三角测距法和/或飞行时间测距法测得的距离作为判断依据,本申请对此不作限制。
在一些实现方式中,当被测物体至电子设备的距离小于第一预设距离时,则表明当前测距场 景属于近距离测距场景,因此采用三角测距法测得的距离作为测距结果,而当被测物体至电子设备的距离大于或等于第二预设距离时,则表明当前测距场景不属于近距离测距场景,则采用飞行时间测距法测得的距离作为测距结果。其中,第一预设距离的取值为经验值或实验值,例如可以为1米,第二预设距离大于或等于第一预设距离。
可以理解,上述方法是为了保证电子设备在近距离测距场景或远距离测距场景下测距结果的精确度。在实际应用中,当被测物体至电子设备的距离处于近距离测距场景和远距离测距场景之间时,例如大于1米小于50米时,由于此时三角测距法和飞行时间测距法的精确度区别不大,故此时即可采用三角测距法测得的距离作为测距结果,也可以采用飞行时间测距法测得的距离作为测距结果,本申请对此不作限制。
结合第一方面,在第一方面的一种可能的实现方式中,第一测距方法包括:利用第一发射器出射的第一光线被被测物体反射后,在光电ToF探测元件上的成像位置,确定被测物体至电子设备的距离。也即,三角测距法的测距原理是利用被测物体反射后的光线在光电ToF探测元件上的成像位置与被测物体实际移动距离之间的关系,计算被测物体至电子设备的距离的。具体可参考下文具体实施例部分关于三角测距法原理的相关描述,此处不再赘述。
结合第一方面,在第一方面的一种可能的实现方式中,第一测距方法还包括:若被测物体至电子设备的距离小于或等于第三预设距离,对第一发射器出射的第一光线进行分光处理,使第一发射器出射的第一光线分为第一光路和第二光路,利用第二光路的光线被被测物体反射后,在光电ToF探测元件上的成像位置,确定被测物体至电子设备的距离,其中,第一光路的光线与第一光线的方向相同且第一光路的光线与第二光路的方向之间的夹角范围为5°至10°,第三预设距离小于第一预设距离。
可以理解,在使用三角测距法测距时,当被测物体至电子设备距离过近,例如被测物体至电子设备距离为0.2米时,可能出现测距盲区,即光电ToF探测元件无法检测到被测物体的成像位置,进而导致无法利用三角测距法检测被测物体至电子设备的距离。因此,可在第一发射器上设置分光装置例如分光镜,将第一发射器出射的光线处理成第一光路和第二光路,并且第一光路与原出射方向保持一致,第二光路与第一光路的光线夹角在5°至10°,以使被测物体在距离电子设备小于第三预设距离时,光电ToF探测元件依然可以检测到的被测物体的成像位置。然后当被测物体至电子设备的距离小于第三预设距离例如0.3米时,即可利用被测物体反射的第二光路光线在光电ToF探测元件上的成像位置,计算被测物体至电子设备的距离。
结合第一方面,在第一方面的一种可能的实现方式中,电子设备还包括第二发射器,第一测距方法还包括:若被测物体至电子设备的距离小于或等于第三预设距离,利用第二发射器出射的光线被被测物体反射后,被测物体在光电ToF探测元件上的成像位置,确定被测物体至电子设备的距离,其中,第三预设距离小于第一预设距离,第一发射器与第二发射器出射光线的夹角范围为5°至10°。
类似地,为了避免出现上述测距盲区,当被测物体至电子设备的距离小于第三预设距离例如0.3米时,也可以被测物体反射的电子设备的另一个发射器发射的光线在光电ToF探测元件上的成像位置,来计算被测物体至电子设备的距离。其中,具体计算方式与三角测距法原理相同,此处不再赘述。
结合第一方面,在第一方面的一种可能的实现方式中,第二测距方法包括:利用第一发射器出射的第一光线经过被测物体反射后,被第一接收器接收的过程中所经历的时间,确定被测物体 至电子设备的距离。也即,飞行时间测距法的测距原理是利用发射器发射的经被测物体反射,并由接收器接收的过程中所经历的时间以及光速来计算被测物体至电子设备的距离,具体可参考下文关于飞行时间测距法的描述,此处不再赘述。
其中,需要说明的是,当电子设备具有两个发射器时,也可以利用第二激光器发射的光线经被测物体反射并由接收器接收的过程中所经历的时间以及光速计算被测物体至电子设备的距离,本申请对此不作限制。
第二方面,本申请实施例提供了一种光电探测模组,该模组包括第一激光雷达,用于向被测物体出射第一光线;
接收器,接收器包括光电ToF探测元件,光电ToF探测元件用于接收经被测物体反射后的第一光线;
存储器,用于存储由光电探测模组的一个或多个处理器执行的指令,以及
处理器,是光电探测模组的处理器之一,用于实现第一方面中任一中可能的实现方法中的测距方法。
结合第二方面,在第二方面的一种可能的实现方式中,在第一激光雷达上设置有分光装置,用于对第一激光雷达出射的光线进行分光处理,使第一激光雷达出射的第一光线分为第一光路和第二光路,其中,第一光路的光线与第一光线的方向相同且第一光路的光线与第二光路的方向之间的夹角范围为5°至10°。
结合第二方面,在第二方面的一种可能的实现方式中,光电探测模组还包括第二激光雷达,第一激光雷达与第二激光雷达出射光线的夹角范围为5°至10°。
第三方面,本申请实施例还提供了一种电子设备,该电子设备包括上述第二方面中任意一种光电模组,
存储器,用于存储由电子设备的一个或多个处理器执行的指令,以及
处理器,是电子设备的处理器之一,用于上述第一方面中任一种可能的实现方式中的测距方法。
第四方面,本申请实施例还提供了一种光电ToF探测芯片,光电ToF探测芯片包括:
通信接口,用于输入和/或输出信息;
处理器,用于执行计算机可执行程序,使得安装有光电ToF探测芯片的设备能够实现上述第一方面中任一中可能的实现方法中的测距方法。
结合第四方面,在第四方面的一种可能的实现方式中,光电ToF探测芯片通过通信接口输出被测物体至电子设备的距离。
第五方面,本申请还提供了一种测距方法,应用于电子设备,其特征在于,电子设备包括光电ToF探测芯片、处理器、发射器以及接收器,接收器包括光电ToF探测元件,光电ToF探测芯片支持第一测距方法和第二测距方法;
该方法包括:光电ToF探测芯片基于第一测距方法确定被测物体在光电ToF探测芯片上的成像位置,并且处理器根据确定的成像位置计算被测物体至被测物体至电子设备的第一距离,并且光电ToF探测芯片基于第二测距方法确定光线经被测物体反射后并由接收器接收到的过程中所经历的时间,并且处理器根据确定的时间计算被测物体至电子设备的第二距离;
处理器在第一距离或者第二距离小于第一预设距离的情况下,将第一距离作为测距结果,并在第一距离或者第二距离大于或等于第二预设距离的情况下,将第二距离作为测距结果,其中, 第二预设距离大于或等于第一预设距离。
其中,第一测距方法可以为三角测距法,第二测距方法可以为飞行时间测距法,第一距离为采用三角测距法测得的距离,第二距离为采用飞行时间测距法测得的距离。
也即,本申请的测距方法可由光电ToF探测芯片以及处理器配合实现,即光电ToF探测芯片利用三角测距法或飞行时间测距法测得被测物体至电子设备的距离,然后由处理器根据该距离判断所处的测距场景,并决定使用三角测距法还是飞行时间测距法测得的结果作为最终测距结果。具体地,当第一距离或者第二距离小于第一预设距离的情况下,将第一距离作为测距结果,并在第一距离或者第二距离大于或等于第二预设距离的情况下,将第二距离作为测距结果,其中,第二预设距离大于或等于第一预设距离。其中,第一预设距离和第二预设距离可参考上文相关描述,此处不再赘述。
第六方面,本申请实施例还提供了一种电子设备,该电子设备包括包括光电ToF探测芯片、处理器、发射器以及接收器,接收器包括光电ToF探测元件,光电ToF探测芯片支持第一测距方法和第二测距方法;其中,光电ToF探测芯片能够基于第一测距方法,确定被测物体在光电ToF探测元件上的成像位置,并能够基于第二测距方法,确定光线经被测物体反射后并由接收器接收到的过程中所经历的时间;
处理器能够根据确定的成像位置信息计算被测物体至电子设备的第一距离,并能够根据确定的时间计算被测物体至电子设备的第二距离,并且
处理器还能够在第一距离或者第二距离小于第一预设距离的情况下,将第一距离作为测距结果,并在第一距离或者第二距离大于或等于第二预设距离的情况下,将第二距离作为测距结果,其中,第二预设距离大于或等于第一预设距离。
第七方面,本申请实施例还提供了一种可读介质,可读存储介质存储有计算机程序,其特征在于,计算机程序被处理器执行时实现上述第一方面中任一中可能的实现方法中的测距方法。
第八方面,本申请实施例提供了一种计算机程序产品,当计算机程序产品在电子设备上运行时,使得电子设备执行第一方面中任一中可能的实现方法中的测距方法。
可以理解的是,上述第二方面至第八方面的有益效果可以参见上述第一方面中的相关描述,在此不再赘述。
附图说明
为了更清楚地说明本申请实施例中的技术方案,下面将对实施例或现有技术描述中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图仅仅是本申请的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动性的前提下,还可以根据这些附图获得其他的附图。
图1示出了三角测距法的测距原理;
图2示出了飞行时间测距法的测距原理;
图3示出了采用飞行时间测距法针对不同被测物体时接收到的波形的信号之间的区别;
图4A示出了本申请测距方法适用的电子设备100的硬件结构示意图;
图4B示出了电子设备100的一种具体实现的结构示意图,其中,激光雷达11与接收器12设置在底座50上,底座50与电机30同轴设置;
图4C示出了图4B中激光雷达11与分光镜110之间的相对位置关系;
图4D示出电子设备100的另一种具体实现的结构示意图,其中,激光雷达11与接收器12设 置在底座50上,底座50与电机30不是同轴设置,电机30通过传送皮带40带动底座50转动;
图5示出了本申请中利用光电ToF探测元件进行计时的结构示意图;
图6示出了利用图5所示的装置得到的飞行时间的统计直方图,其中,横轴代表飞行时间,纵轴代表飞行时间出现的频率;
图7示出了本申请中的测距方法的流程示意图;
图8示出了一种实现本申请的测距方法的原理示意图;
图9示出了另一种实现本申请的测距方法的原理示意图;
图10示出了又一种实现本申请的测距方法的原理示意图;
图11示出了又一种实现本申请的测距方法的原理示意图;
图12示出了一种光电ToF探测芯片系统结构示意图。
具体实施方式
下面将使用本领域技术人员通常采用的术语来描述说明性实施例的各个方面。为了便于理解本申请的发明构思,首先介绍上文所述的三角测距法以及飞行时间测距法的原理和各自的缺点。
(1)三角测距法
三角测距法是通过设备上的激光雷达以一定的角度出射一束激光照射被测物体,激光在被测物体表面发生反射,然后由接收器的透镜对反射光汇聚,并在光电探测元件上成像成光斑,然后根据光电探测元件上的被测物体的光斑位置,来计算被测物体至设备的之间的距离。并且,当被测物体沿激光方向发生移动时,光电探测元件可检测到被测物体的光斑产生移动,且其位移大小对应被测物体的移动距离,因此可由光斑位移距离计算出被测物体的移动距离。其中,光电探测元件的可以是电荷耦合器件图像传感器(charge coupled device,CCD)或互补金属氧化物半导体图像传感器(complementary metal-oxide-semiconductor,COMS)。
具体计算过程如下:如图1所示,激光雷达11出射一束激光照射在P处的被测物体200上,经被测物体200反射后的光线成像在接收器12′的光电探测元件13′的光斑为E,当被测物体移动至P1处时,经被测物体200反射后的光线成像在光电探测元件13′的光斑为E′,激光雷达11与光电探测元件13′的连线与激光雷达11出射的激光的方向垂直,B为透镜g的中心。不难看出,图中ΔAPE(即三角形APE)与ΔOBE(即三角形OBE)相似,ΔAP1F′(即三角形AP1F′)与ΔOBE′(即三角形OBE′)相似。因此以下等式成立:
$\frac{AP}{OB} = \frac{AE}{OE}$
$\frac{AP_1}{OB} = \frac{AF'}{OE'}$
其中,OB=f为透镜g的焦距,AO=L是已知的,为激光雷达11与光电探测元件13′之间的距离,OE和OE′为激光经被测物体200反射至光电探测元件13′上的光斑距离透镜g中心B的距离,可以通过光电探测元件13′得到,AP=D1,AP1=D2,因此:
$\frac{D_1}{f} = \frac{L + OE}{OE}$
$\frac{D_2}{f} = \frac{L + OE'}{OE'}$
因此在P处的被测物体200至设备的距离为
$D_1 = \frac{f \cdot (L + OE)}{OE}$
在P1处的被测物体200至设备距离为
$D_2 = \frac{f \cdot (L + OE')}{OE'}$
根据上述三角测距法的原理可知,当被测物体200至激光雷达11的距离越来越远时,被测物体200反射在光电探测元件13′上形成的光斑也会越来越小,并且光斑随被测物体移动而发生的位置改变也会越来越小,直至光电探测元件13′的分辨率不足以区分光斑的位置变化。因此,一方面,三角测距法在远距离测距时精确度较差,而如需提高三角测距法远距离测距的精确度,或者增加透镜g和光电探测元件13′之间的间距,但这样会导致接收器模组的尺寸变大,而且即使如此也仅能保证5M内的测距精度,或者提高光电探测元件13′的分辨率,但实际应用中采用分辨率更高的光电探测元件的成本远高于增大接收器模组的成本,此外分辨率的提高也是有限的。另一方面,三角测距法主要利用被测物体200的光强度信息进行成像,然后根据光斑在光电探测元件13′上的移动距离计算被测物体200至设备的距离。光强度(luminousintensity,I)信息指的是光照的强度,它是光源在某一方向立体角内之光通量大小,单位为坎德拉(candela,cd),而光通量(luminous flux)指人眼所能感觉到的辐射功率。光强度容易受到外界环境的影响,比如在环境光干扰较严重的户外强光或者室内强光的场景下,被激被测物体200反射的光线的光强度会受到影响,导致被测物体200在光电探测元件13′上的成像效果不佳,表现为光斑位置与实际位置不符,进而影响最终的测距结果。
因此三角测距法不适合远距离测距或者有强光干扰的场景,例如不太适用于有户外使用需求的物流服务类机器设备上。
(2)飞行时间测距法
飞行时间测距法则是利用激光从激光雷达11发出至被测物体200,再经被测物体200反射至接收器12′所经历的时间t以及光速c,计算激光雷达11至被测物体200的距离,也即激光雷达11至被测物体200的距离为
$\frac{c \cdot t}{2}$
具体原理可如图2所示,由激光雷达11发射激光脉冲,并由计时器时间将发射激光脉冲的时间记为t_start,然后当接收器12′检测到经被测物体200反射回的脉冲(波)的上升沿时,计时器将时间记为t_stop,则激光脉冲从激光雷达11射出并经被测物体200反射至接收器12′所经历的时间t=t_stop-t_start,激光雷达11与被测物体200之间的距离
$\frac{c \cdot (t_{\text{stop}} - t_{\text{start}})}{2}$
飞行时间测距法主要是利用光能量信息进行测距,也即飞行时间测距法只要检测到被测物体200反射回的脉冲,即可根据出射激光脉冲的时间与反射脉冲的时间确定被测物体200至设备的距离,而无需考虑被测物体在光电探测元件上的具体成像位置。其中,适用于飞行时间测距法的光电探测元件一般为光电ToF探测元件。因此,在激光雷达11出射的激光脉冲的功率和接收器12′中光电ToF探测元件灵敏度足够的情况下,飞行时间测距法的最大测距范围可达到数百米,而且测距精度不随被测物体200至激光雷达11的距离增加而下降。
但是,如上文所言,飞行时间测距法中计时器是根据检测到的反射脉冲的上升沿来计时的,由于不同物体反射激光脉冲的能力不同,如图3所示,对于激光雷达11发射出的圆波脉冲,有些物体200′反射回来的脉冲为方形脉冲波A,而有些物体200″反射回来的脉冲却为圆波脉冲B,由于不同波形上升沿的确定规则不同,这样计时器在根据反射回的不同波形的脉冲上升沿信号计时时,会出现些许误差,比如将圆波脉冲B的波峰作为“上升沿”计时相对于方方形脉冲波A的 上升沿计时会有延迟。而且这种误差在近距离测距场景下更为明显,因为远距离测距场景(比如大于100米)时,反射脉冲能量会发生衰减,不同物体反射光的能力对反射光脉冲波形状影响不大,大部分物体反射回的脉冲波形状都类似于圆波B,而在近距离测距(比如在1米以内)时,反射光脉冲能量衰减较弱,被测物体200反射光的能力对激光脉冲的波形影响更为明显,也即更容易出现上述的波形信号失真的情况,因此对计时器计时效果影响也比较明显,进而导致飞行时间测距法在近距离测距场景下测距的精确度会下降。需要说明的是,上述关于激光雷达11发射的脉冲波形以及接收器12′接收的反射回的脉冲波形仅为示例性的,并不构成对本申请的限制。在一些实施例中,激光雷达11也可以发射方形脉冲波或者其他波形的脉冲波,本申请对此不作限制。
因此飞行时间测距法不太适用近距离测距,例如不太适合应用在家用扫地机器人等经常应用在贴墙延边或小目标避障的设备上。
为了扩大测距方法的适用场景,使其同时满足近距离测距场景和远距离测距场景的需求,本申请提供了一种测距方法。在本申请的测距方法中,根据被测物体至设备之间的距离,决定采用三角测距法还是飞行时间测距法测得的被测物体至设备之间的距离作为测距结果。具体地,电子设备可利用三角测距法和/或飞行时间测距法不断判断被测物体至电子设备的距离,在被测物体至电子设备的距离较远时(也即远距离测距场景时),电子设备采用更适用远距离测距场景的飞行时间测距法测得距离作为测距结果,在被测物体至电子设备的距离较近时(也即近距离测距场景时),电子设备采用更适用近距离测距场景的三角测距法测得的距离作为测距结果,以确保无论是近距离测距还是远距离测距,电子设备的测距结果都比较精确。
在一些实现方式中,可通过设置第一预设距离来区分当前测距场景是近距离测距场景还是远距离测距场景,比如在被测物体至电子设备的距离小于第一预设距离时,为近距离测距场景,在被测物体至电子设备的距离大于或等于第二预设距离时,为远距离测距场景。其中,第一预设距离的设置可以为经验值或实验值,例如为1米,第二预设距离的设置也可以为经验值或实验值,例如100米。在一些实现方式中,第二预设距离也可以等于第一预设距离,也即在被测物体至电子设备的距离大于或等于第一预设距离时,为远距离测距场景。
在一些实现方式中,上述电子设备100可以为智能扫地机器人、物流服务机器人、自动驾驶汽车等,本申请对电子设备100的类型不作任何限制。
下面结合附图4至图9介绍本申请测距方法的具体实现过程。
图4A示出了能够实现本申请测距方法的电子设备100的硬件结构示意图。
如图4A所示,电子设备100的硬件结构包括处理器10、存储器20、电源模块30、激光雷达11、接收器12、计时器14。
其中,接收器12包括光电ToF探测元件13,接收器12用于接收经被测物体200反射回的激光,该激光将在光电ToF探测元件13上成像,光电ToF探测元件13可将反射激光的成像位置信息输出至处理器10,以便处理器10根据反射激光成像位置信息确定被测物体200至电子设备100的距离。同时,光电ToF探测元件13还将接收到的反射激光的电信号传输至计时器14,以便于计时器14记录被测物体200至电子设备100的飞行时间,具体将在下文结合计时器14进行介绍。
在一些实现方式中,光电ToF探测元件13可以是由单光子雪崩二极管(single photon avalanche diode,SPAD)构成的M×N的二维阵列,也即如图5所示的SPAD阵列131。使用SPAD的好处在于其灵敏度高于其他光电二极管。具体地,在SPAD工作电压升高时,其内部电场不断增加至某一特定临界值时,吸收光子后产生的电子空穴对能够碰撞产生新的电子空穴对,这一连锁 反应如同雪崩一般可将光子信号放大,使SPAD工作在盖革模式下,也即只要有一个光子被接收器12接收并产生了电子就会产生一个很大的电流/电压脉冲信号表征此次探测事件发生。可以理解,在其他实现方式中,光电ToF探测元件13也可以是由其他光子二极管构成的阵列,本申请对此不作限制。
在另一些实现方式中,光电ToF探测元件13可以为长条形,且其长宽比可以为4∶3、2∶1或者3∶1,本申请对此也不做限制。
存储器20可以是该电子设备100中的内部存储单元,例如机器人100的硬盘或者内存。存储器20也可以是外部存储设备,例如机器人上配备的插接式硬盘、智能存储卡(smart media card,SMC),安全数字(secure digital,SD)卡,闪存卡(Flash Card)等。
进一步地,存储器20还可以既包括内部存储单元也包括外部存储设备。存储器20用于存储计算机程序21以及系统所需要的其他数据和程序。存储器20还可以用于暂时地存储已经输出或者将要输出的数据。计算机程序21包括计算机程序代码,可以是源代码形式、对象代码形式、可执行文件或者某些中间形式等。
计算机可读存储介质可以包括能够携带计算机程序代码的任何实体或者装置、记录介质、U盘、移动硬盘、磁碟、光盘、计算机存储器、只读存储器(read-only memory,ROM)、随机存取存储器(random access memory,RAM)、电载波信号、电信信号以及软件分发介质等。处理器10可以执行存储器20中存储的计算机程序21以实现本申请提供的测距方法。
示例性的,计算机程序21可以被分割成一个或多个模块/单元,一个或多个模块/单元被存储在存储器20中,并由处理器10执行,以本申请下文中各个实施例的方法。一个或者多个模块/单元可以是能够完成特定功能的一系列计算机程序指令段,该指令段用于描述计算机程序21在该机器人中的执行过程。
在一些实现方式中,处理器10可以用来根据光电ToF探测元件13上被测物体的成像位置计算出被测物体200至电子设备的距离,或者根据接收器12接收到的被测物体反射200回的脉冲的时间以及激光雷达11出射激光脉冲的时间,确定被测物体200至电子设备的距离。
处理器10可以是中央处理单元(central processing unit,CPU),还可以是其他通用处理器、数字信号处理器(digital signal processor,DSP)、专用集成电路(application specific integrate circuit,ASIC)、现场可编辑门阵列(field-programmable gate array,FPGA)或者其他可编程逻辑器件、分立门或者警惕管理逻辑器件、分立硬件组件等。通用处理器可以是微处理器或者该处理器也可以是任何常规的处理器等。
处理器10还可以根据所测得被测物体至设备之间的距离,控制电子设备100的移动或激光雷达11转向,例如控制电子设备100的舵机向左、向右、向前、向后转,或者控制上述激光雷达11按照一定的频率旋转。在一些实现方式中,电子设备100也可以包括控制器(图中未示出),然后由控制器根据处理器10的指令,控制电子设备100的移动或激光雷达11的转向,本申请对此不作限制。
激光雷达11用于对当前场景进行实时信息采集以获得当前场景的激光扫描信息,处理器10根据激光扫描信息(例如,激光雷达扫描到的物体的相对于电子设备100的距离信息、物体的轮廓信息等等)进行分析处理,例如,处理器10将激光扫描信息与存储器20内存储的预设的场景信息进行对比,确定当前场景内是否存在障碍物,并根据判断结果对电子设备100的行进路线进行调整,以避免造成碰撞等事故的发生。在本申请的一些实施例中,激光雷达12还可以结合3D 视觉地图,用于设备进行定位,例如用于智能扫地机器人进行定位。具体地,电子设备100可以将激光雷达12扫描到的障碍物轮廓信息和3D视觉地图中的障碍物信息进行比对,来更加精准地避开障碍物。
在本申请的一些实施例中,激光雷达11用于以一定的出射角度出射激光脉冲。其中,激光雷达11出射的激光脉冲可以是波长范围为780纳米至940纳米的红外激光脉冲,例如波长为905纳米的红外激光脉冲。
并且,在一些实现方式中,设备可以具有多个激光雷达,例如激光雷达11和激光雷达11′,二者出射的激光之间夹角范围在4°至15°之间,例如5°或10°。这样做的目的在于,避免在近距离测距场景采用三角测距法时出现测距盲区,也即由于被测物体至设备之间的距离过近,导致被测物体的成像位置超出光电ToF探测元件13的可检测范围。下文将对此展开介绍。
类似地,为了降低设备的造价成本,如图4B至图4C所示,也可直接在现有激光雷达11上增设分光镜110对激光雷达11出射的激光进行分光处理,得到第一光路的激光和第二光路的激光,其中第一光路的激光出射角度与激光雷达11未进行分光处理的出射角度一致,第二光路的激光与第一光路的激光的夹角范围α为4°至15°,例如5°或10°。这种方式也可以避免在近距离测距场景采用三角测距法时出现的测距盲区。下文将对此展开介绍。
在一些实现方式中,如图4B所示,激光雷达11和接收器12固定在底座50上,底座50与电机30同轴设置,由处理器11控制电机30旋转并带动底座50同步旋转,进而带动激光雷达11和接收器12旋转,以获得360°的扫描视场角(Field of view,FoV)。在另一些实现方式中,如图4D所示,激光雷达11与接收器12的底座50也可以与电机30不同轴设置,然后通过传送皮带40的方式由电机30带动底座50旋转,进而带动激光雷达11和接收器12旋转,本申请对此不作限制。
在一些实现方式中,上述处理器10、存储器20、激光雷达11以及接收器12可构成光电探测模组,用于实现对被测物体的距离的检测。
计时器(time-to-digital converter,TDC)14,也称为时间数字转换电路。在本申请的实现方式中,用于记录采用飞行时间测距法测距时所需的飞行时间。具体地,在激光雷达11出射激光时,设备内的驱动电路(图中未示出)会生成触发信号,触发信号同时会被作为飞行时间测量的起始时间点t start。然后如图5所示,当被测物体200反射回来的光被接收器12中的SPAD阵列13中的任意一个SPAD 131探测到时,SPAD阵列13会生成电压脉冲信号(例如上述方波型脉冲A信号或圆波型脉冲B信号)以表征飞行时间探测完成时间点t_stop,然后将该电压脉冲信号输出至计时器14,由计时器14记录(t_stop-t_start)并将其转换为二进制数字信号。
在本申请的一些实现方式中,为了使飞行时间更加精确,可以利用SPAD阵列13统计多次飞行时间,然后将出现频率最高的一次飞行时间作为最终的飞行时间。可以理解,在某一时刻,SPAD阵列13中检测到被测物体反射回的激光的SPAD 131的数量越多,表明该时刻被测物体反射回的光的能量越强,对应的,该时刻对应的飞行时间出现的频率也会越高(因为同时有多个SPAD 131检测到反射光),也即该时刻对应的飞行时间最有可能是激光从出射经被测物体反射回接收器12所经历的飞行时间。例如图6示出了SPAD阵列13的飞行时间统计直方图,横轴代表飞行时间,纵轴代表该飞行时间出现的频率,经过波形拟合,可知图6中飞行时间t12的频率最高,因此激光从出射到反射回接收器12所经历的飞行时间即为t11。在其他实现方式中,也可以采用其他统计方法确定飞行时间,本申请对此不作限制。
电源模块30可以包括电源、电源管理部件等。电源可以为电池。在一些实现方式中,电源管理部件包括充电管理模块和电源管理模块。充电管理模块用于从充电器接收充电输入;电源管理模块用于连接电源、充电管理模块与处理器10。电源管理模块接收电源和/或充电管理模块的输入,为处理器10、激光雷达11等供电。
本领域技术人员可以理解,图4A仅仅是该电子设备100硬件结构的一个示例,并不构成对电子设备100的硬件结构的限定,在其他硬件结构中可以包括比图2更多或者更少的部件,例如,还可以包括输入输出设备、网络接入设备、总线、通信模块等。
下面以图7为例介绍本申请测距方法的具体实现过程。图7所示的测距方法由上述电子设备100的处理器执行。该方法包括:
701,利用第一测距方法和/或第二测距方法判断被测物体200至电子设备100的距离是否超出第一预设距离。这样做的目的是为了判断被测物体200至电子设备100的距离属于近距离测距场景还是远距离测距场景,以便根据具体的测距场景,确定采用哪种测距方法测得的距离作为最终测距结果。在一些实现方式中,第一预设距离为经验值或试验值,第一预设距离取值例如可以为1米,也即在被测物体200至电子设备100的距离小于1米的测距场景,属于近距离测距场景,在被测物体200至电子设备100的距离大于1米的测距场景,属于远距离测距场景。并且第一测距方法可以为三角测距法,第二测距方法可以为飞行时间测距法。
在确定当前测距的场景属于近距离测距场景或者远距离测距场景后,电子设备100可根据需求确定采用更适合近距离测距场景的三角测距法测得的距离作为测距结果,还是采用更适合远距离测距场景的飞行时间测距法。
需要说明的是,在实际应用中,关于近距离测距场景和远距离测距场景的区分并非仅由一个距离数值来确定,区分近距离测距场景和远距离测距场景的往往是一个距离范围,比如在小于这个距离范围的最小值时,属于近距离测距场景,在大于或等于这个距离范围的最大值时,属于远距离测距场景,在二者之间时,两种测距方法的精确度差别不大,因此为了节省电子设备100功耗,这种情况下,可以不切换测距方法,或者输出采用第一测距方法和第二测距方法确定的被测物体至电子设备的距离中的任意一个作为测距结果,本申请对此不作限制。
在具体判断被测物体200至电子设备100的距离时,既可用三角测距法判断被测物体200至电子设备100的距离,也可利用飞行时间测距法判断被测物体200至电子设备100的距离,还可以结合三角测距法和飞行时间测距法共同判断被测物体200至电子设备100的距离,本申请对此不作限制。
在一些实现方式中,受限于三角测距法的原理,当被测物体200至电子设备100的距离过近时,例如当其位于图1中的P2处时,被测物体200的成像位置就超出了光电探测元件13′的可检测范围,也即这种情况下电子设备无法在利用三角测距法判断被测物体200的至电子设备100的距离,也即所谓的“测距盲区”。“测距盲区”表现在本申请方案中,即如图8所示,被测物体200在P4处时,在激光雷达11沿光路1出射的光线下,其成像光斑2的位置会超出光电ToF探测元件13的可检测范围,导致电子设备100无法利用三角测距法测量被测物体200至电子设备100的距离。
为了避免出现测距盲区,电子设备200可在检测到被测物体200至电子设备100的距离小于或等于第三预设距离时,启动设置在其上的第二激光雷达11′并利用第二激光雷达11′探测被测物体200,其中第二激光雷达11′出射光线能够保证被测物体200在小于或等于第三预设距离时 的成像位置不会超出光电探测元件13可检测范围。其中,第三预设距离小于上述第一预设距离,并且其取值也可以为经验值或实验值,例如第三预设距离取值可为0.3米。
在一些实现方式中,可通过设计第二激光雷达出射光线的角度,使被测物体200在小于或等于第三预设距离时的成像位置不会超出光电ToF探测元件13的可检测范围,例如如上图4A中所言的,第二激光雷达出射光线的角度可设计为与第一激光雷达出射光线的角度之间的夹角为5°至10°。比如图9所示,第二激光雷达出射光线的光路3与第一激光雷达出射光线的光路1之间的夹角α取值为10°。
为了尽可能降低电子设备或上述光电测距模组的造价成本,在另一些实现方式中,如上图4A中所言的,也可直接在第一激光雷达11上设置分光镜110,对第一激光雷达11出射的光线进行分光处理,得到两路不同的光线,其中第一光路光线保持原出射方向,第二路光线与原出射方向之间的夹角范围为5°至10°,比如图10所示的,经过分光镜110分光处理后,第一激光雷达出射的光路4经过分光处理后,得到光路41、光路42,光路41与光路42之间的夹角β为10°,在被测物体200至电子设备100的距离小于第三预设距离时,被测物体200可在第二光路光线下成像至光电ToF探测元件13上,然后由光电ToF探测元件13检测被测物体200的成像光斑位置,以根据被测物体200成像光斑位置利用三角测距法判断此时被测物体200至电子设备100距离是否超出第一预设距离。
702,若被测物体200至电子设备100的距离小于第一预设距离,将采用第一测距方法确定的被测物体200至电子设备100的距离作为测距结果。也即在第一预设距离内,第一测距方法测得的被测物体200至电子设备100距离的精确度要比第二测距方法测得的被测物体200至电子设备100的距离的精确度高,因此采用第一测距方法测距的被测物体200至电子设备100的距离作为最终的测距结果。
例如,利用三角测距法检测到的被测物体200至电子设备100的距离为0.789米,利用飞行时间测距法检测到的被测物体200至电子设备100的距离为0.8米,小于上述第一预设距离1米,但是三角测距法测得的被测物体200至电子设备100的距离明显比飞行时间测距法测得的被测物体200至电子设备100的距离的精确度要高,因此在这种情况下,采用三角测距法检测到的被测物体200至电子设备100的距离作为测距结果。
其中,在小于第一预设距离(例如1米)的情况下,采用三角测距法测距的原理可如图11所示,即经被测物体200反射回的激光由接收器12接收,并成像在上述的光电ToF探测元件13上,然后电子设备100可根据光电ToF探测元件13输出的反射激光成像光斑的位置信息,利用图1所示的三角测距法确定被测物体200至电子设备100的距离。
703,若被测物体200至电子设备100的距离大于或等于第二预设距离,将采用第一测距方法确定的被测物体200至电子设备100的距离作为测距结果。其中,第二预设距离大于或等于第一预设距离。也即在超出第一预设距离的情况下,第二测距方法测得的被测物体200至电子设备100距离的精确度要比第一测距方法测得的被测物体200至电子设备100的距离的精确度高,因此采用第二测距方法测距的被测物体200至电子设备100的距离作为最终的测距结果。
例如,利用三角测距法检测到的被测物体200至电子设备100的距离为59米,利用飞行时间测距法检测到的被测物体200至电子设备100的距离为58.799米,大于上述第一预设距离1米,但是飞行时间测距法测得的被测物体200至电子设备100的距离明显比三角测距法测得的被测物体200至电子设备100的距离的精确度要高,因此在这种情况下,采用飞行时间测距法检测到的 被测物体200至电子设备100的距离作为测距结果。
图12根据本申请的一些实施例,示出了一种光电ToF探测芯片系统1200示意图。该光电ToF探测芯片1200可应用在上图4A所示的电子设备100上,以执行本申请上述各个实现方式中的测距方法。
如图12所示,光电ToF探测芯片系统1200可以包括一个或多个处理器1210、系统内存1202、非易失性存储器(Non-Volatile Memory,NVM)1203、输入/输出(I/O)设备1204、通信接口1205以及用于耦接处理器1210、系统内存1202、非易失性存储器1203、通信接口1204和输入/输出(I/O)设备1205的系统控制逻辑1206。其中:处理器1210可以包括一个或多个单核或多核处理器。在一些实施例中,处理器1210可以包括通用处理器和专用处理器(例如,图形处理器,应用处理器,基带处理器等)的任意组合。在一些实施例中,处理器1210可以用于实现图7所示的实施例提供的测距方法。
系统内存1202是易失性存储器,例如随机存取存储器(Random-Access Memory,RAM),双倍数据率同步动态随机存取存储器(Double Data Rate Synchronous Dynamic Random Access Memory,DDR SDRAM)等。系统内存用于临时存储数据和/或指令,例如,在一些实施例中,系统内存1202可以用于存储上述用于实现测距方法的可执行程序。
非易失性存储器12012可以包括用于存储数据和/或指令的一个或多个有形的、非暂时性的计算机可读介质。在一些实施例中,非易失性存储器1203可以包括闪存等任意合适的非易失性存储器和/或任意合适的非易失性存储设备,例如硬盘驱动器(Hard Disk Drive,HDD)、光盘(Compact Disc,CD)、数字通用光盘(Digital Versatile Disc,DVD)、固态硬盘(Solid-State Drive,SSD)等。在一些实施例中,非易失性存储器12012也可以是可移动存储介质,例如安全数字(Secure Digital,SD)存储卡等。
特别地,系统内存1202和非易失性存储器12012可以分别包括:指令1207的临时副本和永久副本。指令1207可以包括:由处理器1210中的至少一个执行时使电子设备1001200实现本申请各实施例提供的测距方法的程序指令。
输入/输出(I/O)设备1204可以包括用户界面,使得用户能够与电子设备1001200进行交互。例如,在一些实施例中,输入/输出(I/O)设备1204可以包括显示器等输出设备,用于显示电子设备1001200中的保险管理系统界面,还可以包括键盘、鼠标、触摸屏等输入设备。产品开发人员可以通过用户界面以及键盘、鼠标、触摸屏等输入设备与电子设备1001200进行交互。
通信接口1205,用于为电子设备1001200提供有线或无线通信接口,进而通过一个或多个网络与任意其他合适的设备进行通信。在一些实施例中,通信接口1205可以集成于光电ToF探测芯片系统120的其他组件,例如通信接口1205可以集成于处理器1210中。在一些实施例中,电子设备1001200可以通过通信接口1205和其他设备通信。在一些实施例中,光电ToF探测芯片系统1200通过通信接口1205输出其采用上述测距方法测得的被测物体200至设备的距离。
系统控制逻辑1206可以包括任意合适的接口控制器,以电子设备1001200的其他模块提供任意合适的接口。例如在一些实施例中,系统控制逻辑1206可以包括一个或多个存储器控制器,以提供连接到系统内存1202和非易失性存储器12012的接口。
在一些实施例中,处理器1210中的至少一个可以与用于系统控制逻辑1206的一个或多个控制器的逻辑封装在一起,以形成系统封装(System in Package,SiP)。在另一些实施例中,处理器1210中的至少一个还可以与用于系统控制逻辑1206的一个或多个控制器的逻辑集成在同一 芯片上,以形成片上系统(System-on-Chip,SoC)。
可以理解,本申请实施例示出的芯片系统的结构并不构成对芯片系统的具体限定。在本申请另一些实施例中,芯片系统可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
此外,本申请还提供了一种测距方法,该方法可应用于电子设备,该电子设备包括光电ToF探测芯片、处理器、激光雷达以及接收器,其中,接收器还包括上图4A所示的光电ToF探测元件,该光电ToF探测元件也可支持上述两种测距方法。该方法包括:
光电ToF探测芯片基于第一测距方法确定被测物体在光电ToF探测芯片上的成像位置,并且处理器根据确定的成像位置计算被测物体至被测物体至电子设备的第一距离,并且
光电ToF探测芯片基于第二测距方法确定光线经被测物体反射后并由接收器接收到的过程中所经历的时间,并且处理器根据确定的时间计算被测物体至电子设备的第二距离。
处理器在第一距离或者第二距离小于第一预设距离的情况下,将第一距离作为测距结果,并在第一距离或者第二距离大于或等于第二预设距离的情况下,将第二距离作为测距结果,其中,第二预设距离大于或等于第一预设距离。
其中,如上文所述的,第一测距方法可以是三角测距法,第二测距方法可以是飞行时间测距法,相应的,第一距离为采用三角测距法测距的被测物体至电子设备的距离,第二距离为采用飞行时间测距法测得的被测物体至电子设备的距离。其中,第一测距方法以及第二测距方法的具体实现过程可参考上文相关描述,此处不再赘述。
当光电ToF探测芯片采用三角测距法或飞行时间测距法检测到被测物体至电子设备的距离后,由光电ToF探测芯片将距离发送至电子设备的处理器,并由该电子设备的处理器根据接收到的距离,确定当前测距场景属于近距离测距场景还是远距离测距场景。如果第一距离或第二距离小于第一预设距离,表明当前处于近距离测距场景,则采用三角测距法测得的第一距离作为测距结果,当第一距离或第二距离大于或等于第二预设距离,表明当前处理远距离测距场景,则采用飞行时间测距法测得的第二距离作为测距结果,以保证电子设备在近距离测距场景或远距离测距场景下测距结果的精确度。
对应于上述测距方法,本申请还提供了另一种电子设备,其中,与上述测距方法相同的描述可参考上文相关描述,以下不再赘述。具体地,该电子设备包括光电ToF探测芯片,处理器、激光雷达、接收器,其中接收器包括光电ToF探测芯片,并且该光电ToF探测芯片可支持第一测距方法和第二测距方法。其中,光电ToF探测芯片能够基于第一测距方法,确定被测物体在光电ToF探测元件上的成像位置,并能够基于第二测距方法,确定光线经被测物体反射后并由接收器接收到的过程中所经历的时间;
处理器能够根据确定的成像位置信息计算被测物体至电子设备的第一距离,并能够根据确定的时间计算被测物体至电子设备的第二距离,并且
处理器还能够在第一距离或者第二距离小于第一预设距离的情况下,将第一距离作为测距结果,并在第一距离或者第二距离大于或等于第二预设距离的情况下,将第二距离作为测距结果,其中,第二预设距离大于或等于第一预设距离。
此外,本申请实施例还提供了一种计算机可读存储介质,计算机可读存储介质存储有计算机程序,计算机程序被处理器执行时实现可实现上述各个方法实施例中的步骤。
本申请实施例提供了一种计算机程序产品,当计算机程序产品在电子设备100上运行时,使 得电子设备100执行时实现可实现上述各个方法实施例中的步骤。
本申请实施例还提供了一种电子设备100,该电子设备100包括:的光电探测模组,至少一个处理器、存储器以及存储在存储器中并可在至少一个处理器上运行的计算机程序,处理器执行计算机程序时实现上述任意各个方法实施例中的步骤。在一些实施例中,上述光电探测模组自身可包括处理器、存储器等设备,也即测距方法可以由光电探测模组的处理器以及存储器配合实现,也可以由光电探测模组发送至电子设备的处理器来实现,本申请对此不作限制。
本申请公开的机制的各实施例可以被实现在硬件、软件、固件或这些实现方法的组合中。本申请的实施例可实现为在可编程系统上执行的计算机程序或程序代码,该可编程系统包括至少一个处理器、存储系统(包括易失性和非易失性存储器和/或存储元件)、至少一个输入设备以及至少一个输出设备。
可将程序代码应用于输入指令,以执行本申请描述的各功能并生成输出信息。可以按已知方式将输出信息应用于一个或多个输出设备。为了本申请的目的,处理系统包括具有诸如例如数字信号处理器(Digital Signal Processor,DSP)、微控制器、专用集成电路(Application Specific Integrated Circuit,ASIC)或微处理器之类的处理器的任何系统。
程序代码可以用高级程序化语言或面向对象的编程语言来实现,以便与处理系统通信。在需要时,也可用汇编语言或机器语言来实现程序代码。事实上,本申请中描述的机制不限于任何特定编程语言的范围。在任一情形下,该语言可以是编译语言或解释语言。
在一些情况下,所公开的实施例可以以硬件、固件、软件或其任何组合来实现。所公开的实施例还可以被实现为由一个或多个暂时或非暂时性机器可读(例如,计算机可读)存储介质承载或存储在其上的指令,其可以由一个或多个处理器读取和执行。例如,指令可以通过网络或通过其他计算机可读介质分发。因此,机器可读介质可以包括用于以机器(例如,计算机)可读的形式存储或传输信息的任何机制,包括但不限于,软盘、光盘、光碟、只读存储器(CD-ROMs)、磁光盘、只读存储器(Read Only Memory,ROM)、随机存取存储器(Random Access Memory,RAM)、可擦除可编程只读存储器(Erasable Programmable Read Only Memory,EPROM)、电可擦除可编程只读存储器(Electrically Erasable Programmable Read-Only Memory,EEPROM)、磁卡或光卡、闪存、或用于利用因特网以电、光、声或其他形式的传播信号来传输信息(例如,载波、红外信号数字信号等)的有形的机器可读存储器。因此,机器可读介质包括适合于以机器(例如计算机)可读的形式存储或传输电子指令或信息的任何类型的机器可读介质。
在附图中,可以以特定布置和/或顺序示出一些结构或方法特征。然而,应该理解,可能不需要这样的特定布置和/或排序。而是,在一些实施例中,这些特征可以以不同于说明性附图中所示的方式和/或顺序来布置。另外,在特定图中包括结构或方法特征并不意味着暗示在所有实施例中都需要这样的特征,并且在一些实施例中,可以不包括这些特征或者可以与其他特征组合。
需要说明的是,本申请各设备实施例中提到的各单元/模块都是逻辑单元/模块,在物理上,一个逻辑单元/模块可以是一个物理单元/模块,也可以是一个物理单元/模块的一部分,还可以以多个物理单元/模块的组合实现,这些逻辑单元/模块本身的物理实现方式并不是最重要的,这些逻辑单元/模块所实现的功能的组合才是解决本申请所提出的技术问题的关键。此外,为了突出本申请的创新部分,本申请上述各设备实施例并没有将与解决本申请所提出的技术问题关系不太密切的单元/模块引入,这并不表明上述设备实施例并不存在其它的单元/模块。
需要说明的是,在本专利的示例和说明书中,诸如第一和第二等之类的关系术语仅仅用来将 一个实体或者操作与另一个实体或操作区分开来,而不一定要求或者暗示这些实体或操作之间存在任何这种实际的关系或者顺序。而且,术语“包括”、“包含”或者其任何其他变体意在涵盖非排他性的包含,从而使得包括一系列要素的过程、方法、物品或者设备不仅包括那些要素,而且还包括没有明确列出的其他要素,或者是还包括为这种过程、方法、物品或者设备所固有的要素。在没有更多限制的情况下,由语句“包括一个”限定的要素,并不排除在包括要素的过程、方法、物品或者设备中还存在另外的相同要素。虽然通过参照本申请的某些优选实施例,已经对本申请进行了图示和描述,但本领域的普通技术人员应该明白,可以在形式上和细节上对其作各种改变,而不偏离本申请的精神和范围。

Claims (15)

  1. 一种测距方法,其特征在于,应用于电子设备,所述电子设备包括第一发射器和第一接收器,所述第一接收器包括光电飞行时间ToF探测元件,所述光电ToF探测元件支持第一测距方法和第二测距方法,所述方法包括:利用第一测距方法和/或第二测距方法判断所述被测物体至所述电子设备的距离是否超出第一预设距离;
    若所述被测物体至所述电子设备的距离小于所述第一预设距离,则将采用所述第一测距方法确定的所述被测物体至所述电子设备的距离作为测距结果;或
    若被测物体至所述电子设备的距离大于或等于所述第二预设距离,则将采用所述第二测距方法确定的所述被测物体至所述电子设备的距离作为所述测距结果,其中,第二预设距离大于或等于第一预设距离。
  2. 根据权利要求1所述的方法,其特征在于,所述光电ToF探测元件的形状为长条形,所述光电ToF探测元件的长宽比大于等于3。
  3. 根据权利要求1或2所述的方法,其特征在于,所述第一测距方法包括:
    利用所述第一发射器出射的第一光线被被测物体反射后,在所述光电ToF探测元件上的成像位置,确定所述被测物体至所述电子设备的距离。
  4. 根据权利要求3所述的方法,其特征在于,所述第一测距方法还包括:
    若被测物体至所述电子设备的距离小于或等于第三预设距离,对所述第一发射器出射的第一光线进行分光处理,使第一发射器出射的第一光线分为第一光路和第二光路,利用所述第二光路的光线被被测物体反射后,在所述光电ToF探测元件上的成像位置,确定所述被测物体至所述电子设备的距离,其中,
    所述第一光路的光线与所述第一光线的方向相同且所述第一光路的光线与所述第二光路的方向之间的夹角范围为5°至10°,所述第三预设距离小于所述第一预设距离。
  5. 根据权利要求3所述的方法,其特征在于,所述电子设备还包括第二发射器,所述第一测距方法还包括:
    若被测物体至所述电子设备的距离小于或等于第三预设距离,利用所述第二发射器出射的光线被被测物体反射后,所述被测物体在所述光电ToF探测元件上的成像位置,确定所述被测物体至所述电子设备的距离,其中,所述第三预设距离小于所述第一预设距离,所述第一发射器与所述第二发射器出射光线的夹角范围为5°至10°。
  6. 根据权利要求1至5中任一项所述的方法,其特征在于,所述第二测距方法包括:
    利用所述第一发射器出射的第一光线经过所述被测物体反射后,被所述第一接收器接收的过程中所经历的时间,确定所述被测物体至所述电子设备的距离。
  7. 一种光电探测模组,其特征在于,包括:
    第一激光雷达,用于向被测物体出射第一光线;
    接收器,所述接收器包括光电ToF探测元件,所述光电ToF探测元件用于接收经被测物体反射后的第一光线;
    存储器,用于存储由所述光电探测模组的一个或多个处理器执行的指令,以及
    处理器,是所述光电探测模组的处理器之一,用于执行权利要求1至6中任一项所述的测距方法。
  8. 根据权利要求8所述的光电探测模组,其特征在于,在所述第一激光雷达上设置有分光装 置,用于对所述第一激光雷达出射的光线进行分光处理,使第一激光雷达出射的第一光线分为第一光路和第二光路,其中,所述第一光路的光线与所述第一光线的方向相同且所述第一光路的光线与所述第二光路的方向之间的夹角范围为5°至10°。
  9. 根据权利要求8所述的光电探测模组,其特征在于,所述光电探测模组还包括第二激光雷达,所述第一激光雷达与所述第二激光雷达出射光线的夹角范围为5°至10°。
  10. 一种电子设备,其特征在于,所述电子设备包括权利要求8至10中任一项所述的光电探测模组,
    存储器,用于存储由电子设备的一个或多个处理器执行的指令,以及
    处理器,是电子设备的处理器之一,用于执行权利要求1至6中任一项所述的测距方法。
  11. 一种光电ToF探测芯片,其特征在于,所述光电ToF探测芯片包括:
    通信接口,用于输入和/或输出信息;
    处理器,用于执行计算机可执行程序,使得安装有所述光电ToF探测芯片的设备执行权利要求1至6中任一项所述的测距方法。
  12. 根据权利要求11所述的光电ToF探测芯片,其特征在于,所述光电ToF探测芯片通过所述通信接口输出所述被测物体至所述电子设备的距离。
  13. 一种测距方法,应用于电子设备,其特征在于,所述电子设备包括光电ToF探测芯片、处理器、发射器以及接收器,所述接收器包括光电ToF探测元件,所述光电ToF探测芯片支持第一测距方法和第二测距方法;
    所述方法包括:
    所述光电ToF探测芯片基于第一测距方法确定被测物体在光电ToF探测芯片上的成像位置,并且所述处理器根据确定的所述成像位置计算所述被测物体至所述被测物体至电子设备的第一距离,并且
    光电ToF探测芯片基于第二测距方法确定光线经被测物体反射后并由接收器接收到的过程中所经历的时间,并且所述处理器根据确定的所述时间计算所述被测物体至所述电子设备的第二距离;
    所述处理器在所述第一距离或者第二距离小于第一预设距离的情况下,将所述第一距离作为测距结果,并在所述第一距离或者第二距离大于或等于所述第二预设距离的情况下,将所述第二距离作为测距结果,其中,第二预设距离大于或等于第一预设距离。
  14. 一种电子设备,其特征在于,包括光电ToF探测芯片、处理器、发射器以及接收器,所述接收器包括光电ToF探测元件,所述光电ToF探测芯片支持第一测距方法和第二测距方法;其中,
    所述光电ToF探测芯片能够基于第一测距方法,确定被测物体在光电ToF探测元件上的成像位置,并能够基于第二测距方法,确定光线经被测物体反射后并由接收器接收到的过程中所经历的时间;
    所述处理器能够根据确定的所述成像位置信息计算所述被测物体至电子设备的第一距离,并能够根据确定的所述时间计算所述被测物体至所述电子设备的第二距离,并且
    所述处理器还能够在所述第一距离或者第二距离小于第一预设距离的情况下,将所述第一距离作为测距结果,并在所述第一距离或者第二距离大于或等于第二预设距离的情况下,将所述第二距离作为所述测距结果,其中,第二预设距离大于或等于第一预设距离。
  15. 一种可读介质,其特征在于,所述可读介质上存储有指令,该指令在电子设备上执行时使电子设备执行权利要求1至6中任一项所述的测距方法。
PCT/CN2022/125660 2022-02-24 2022-10-17 测距方法、光电探测模组、芯片、电子设备及介质 WO2023159974A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210175433.5A CN116699621A (zh) 2022-02-24 2022-02-24 测距方法、光电探测模组、芯片、电子设备及介质
CN202210175433.5 2022-02-24

Publications (1)

Publication Number Publication Date
WO2023159974A1 true WO2023159974A1 (zh) 2023-08-31

Family

ID=87764591

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/125660 WO2023159974A1 (zh) 2022-02-24 2022-10-17 测距方法、光电探测模组、芯片、电子设备及介质

Country Status (2)

Country Link
CN (1) CN116699621A (zh)
WO (1) WO2023159974A1 (zh)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104730532A (zh) * 2013-12-18 2015-06-24 Lg电子株式会社 距离测量装置及其方法
CN105572684A (zh) * 2014-11-05 2016-05-11 日立-Lg数据存储韩国公司 距离测量设备
JP2019197428A (ja) * 2018-05-10 2019-11-14 株式会社イージスモスジャパン 障害物検知システム
CN210534336U (zh) * 2019-08-08 2020-05-15 厦门市和奕华光电科技有限公司 一种激光雷达
CN212623087U (zh) * 2020-05-29 2021-02-26 上海擎朗智能科技有限公司 一种激光测距装置及机器人
WO2021085128A1 (ja) * 2019-10-28 2021-05-06 ソニーセミコンダクタソリューションズ株式会社 測距装置、測定方法、および、測距システム


Also Published As

Publication number Publication date
CN116699621A (zh) 2023-09-05

Similar Documents

Publication Publication Date Title
CN110609293B (zh) 一种基于飞行时间的距离探测系统和方法
CN108445506B (zh) 一种提高激光雷达透雾性的测量方法
WO2022126427A1 (zh) 点云处理方法、点云处理装置、可移动平台和计算机存储介质
JP2021510417A (ja) 階層化されたパワー制御によるlidarベースの距離測定
US11828874B2 (en) Electronic apparatus and method
CN113970757A (zh) 一种深度成像方法及深度成像系统
WO2021051281A1 (zh) 点云滤噪的方法、测距装置、系统、存储介质和移动平台
WO2022052606A1 (zh) 电子装置、电子装置的控制方法及计算机可读存储介质
US20240159879A1 (en) Detection control method and apparatus
US20230065210A1 (en) Optical distance measuring device
WO2023240619A1 (zh) 测量距离的方法及激光雷达
CN114488173A (zh) 一种基于飞行时间的距离探测方法和系统
WO2023159974A1 (zh) 测距方法、光电探测模组、芯片、电子设备及介质
CN112105944A (zh) 具有使用短脉冲和长脉冲的多模式操作的光学测距系统
EP3709052A1 (en) Object detector
CN109212544A (zh) 一种目标距离探测方法、装置及系统
CN114026461A (zh) 构建点云帧的方法、目标检测方法、测距装置、可移动平台和存储介质
WO2022247554A1 (zh) 清洁机器人的回充方法和清洁机器人系统
WO2020087376A1 (zh) 光探测方法、光探测装置和移动平台
KR102076478B1 (ko) 이동성 거울을 이용한 광 송수신기, 3차원 거리 측정 장치, 및 이동체
WO2020113564A1 (zh) 一种激光接收电路及测距装置、移动平台
EP4206733A1 (en) Laser unit for laser radar, and laser radar
CN114814880A (zh) 一种激光雷达探测参数调整控制方法及装置
CN114236504A (zh) 一种基于dToF的探测系统及其光源调整方法
WO2022036714A1 (zh) 激光测距方法、测距装置和可移动平台

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22928237

Country of ref document: EP

Kind code of ref document: A1