WO2023077412A1 - Method and device for measuring the distance of an object - Google Patents


Info

Publication number
WO2023077412A1
Authority
WO
WIPO (PCT)
Prior art keywords
target object
target
distance
image
unwrapping
Prior art date
Application number
PCT/CN2021/128943
Other languages
English (en)
Chinese (zh)
Inventor
罗鹏飞 (Pengfei Luo)
唐样洋 (Yangyang Tang)
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority to CN202180099316.9A priority Critical patent/CN117480408A/zh
Priority to PCT/CN2021/128943 priority patent/WO2023077412A1/fr
Publication of WO2023077412A1 publication Critical patent/WO2023077412A1/fr

Links

Images

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S 17/06 Systems determining position data of a target
    • G01S 17/08 Systems determining position data of a target for measuring distance only
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88 Lidar systems specially adapted for specific applications
    • G01S 17/89 Lidar systems specially adapted for specific applications for mapping or imaging

Definitions

  • the present application relates to the technical field of three-dimensional distance measurement, and in particular to a method and device for object distance measurement.
  • The time-of-flight (TOF) ranging method mainly uses the round-trip flight time of an optical signal, sent by the transmitter and reflected by the target object back to the receiver, to calculate the distance from the target object to the TOF ranging system.
  • Measuring the time of flight directly is difficult.
  • Instead, the TOF ranging system can calculate the distance by measuring the phase delay between the transmitted signal and the reflected signal, and use a depth map to represent the distance information corresponding to the surface of the target object.
  • the TOF ranging system can use single-frequency ranging technology.
  • Single-frequency ranging technology means that the TOF ranging system transmits optical signals at one modulation frequency and performs ranging based on the phase delay. Under this modulation frequency, the maximum distance that can be measured within one signal period is called the ambiguity distance. If the actual distance between the TOF ranging system and the target object exceeds the ambiguity distance, phase wrapping occurs: the measured value of the phase delay of the reflected signal may differ from the actual value. If the actual value exceeds one signal period, it is reduced by N × 2π (N is a positive integer) so that the measured value falls within one signal period; there may therefore be a difference of N × 2π between the actual value and the measured value.
  • For example, if the ambiguity distance is 0.5 meters, then when measuring an object farther than 0.5 meters, phase wrapping causes the TOF ranging system to report a distance between 0 meters and 0.5 meters, so the measured distance differs from the real distance. Therefore, in single-frequency ranging technology, in order to make the measured value match the real value, the ambiguity distance must be expanded as much as possible.
  • The ambiguity distance is inversely proportional to the modulation frequency. To expand the measurable range, the modulation frequency can be reduced to increase the ambiguity distance. However, the modulation frequency also determines the measurement accuracy, and reducing it increases the measurement error.
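This trade-off follows from the usual relation between ambiguity distance and modulation frequency, d_amb = c / (2f); a sketch (with c rounded to 3 × 10^8 m/s, matching the figures quoted later in this document):

```python
C = 3e8  # speed of light (m/s), rounded as in this document's examples


def ambiguity_distance(f_mod):
    """Maximum unambiguous range for a single modulation frequency (Hz)."""
    return C / (2 * f_mod)


def wrapped_distance(true_distance, f_mod):
    """Distance reported under phase wrapping: the true distance folded
    back into one signal period."""
    return true_distance % ambiguity_distance(f_mod)
```

At 100 MHz the ambiguity distance is 1.5 m, so an object at 3.2 m is reported at 0.2 m; lowering the frequency extends the range at the cost of accuracy.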
  • The embodiment of the present application provides a TOF object ranging method.
  • With this method, even if the position of the measured object exceeds the ambiguity distance of the single-frequency ranging technology, the real distance to the target object can still be measured, and the measurement accuracy can be guaranteed.
  • In a first aspect, an embodiment of the present application provides a method for measuring the distance of an object, including: using a time-of-flight (TOF) sensor to acquire a first image and a first phase map corresponding to the first image; identifying a target area corresponding to a target object in the first image, and a second phase map corresponding to the target area in the first phase map; obtaining the estimated distance of the target object from the TOF sensor according to the size information of the target object, where the size information includes the image size of the target object in the first image or the actual physical size of the target object; calculating the unwrapping coefficient corresponding to each pixel in the target area according to the estimated distance and the second phase map; and determining the depth map of the target area according to the unwrapping coefficient and the second phase map.
  • the first image can be a grayscale image or a color image
  • The target area can be understood as a region of interest (ROI), which lets the object ranging device focus on the phase map corresponding to the target object.
  • The estimated distance of the target object from the TOF sensor, obtained from the size information of the object, can be understood as an approximate distance from the target object to the TOF sensor.
  • The unwrapping coefficient is used to indicate the real signal period corresponding to each pixel in the target area.
  • The depth map is used to visualize the true distance corresponding to each pixel in the target area.
  • Since the unwrapping coefficient indicates the signal period corresponding to the target object, once the unwrapping coefficient corresponding to each pixel is determined, the real distance corresponding to each pixel can be determined. Even if the actual distance of the target object exceeds the ambiguity distance, the actual distance of the target object can be detected, and the measurement accuracy can be guaranteed.
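A minimal sketch of how the unwrapping coefficient could be derived from the estimated distance and a wrapped phase (an assumed formulation consistent with the description above, not code from the patent):

```python
import math

C = 3e8  # speed of light (m/s), rounded


def unwrapping_coefficient(d_est, phase, f_mod):
    """N such that the unwrapped distance N*d_amb + d_wrapped lies nearest
    the estimated distance d_est; phase is the wrapped phase in radians."""
    d_amb = C / (2 * f_mod)
    d_wrapped = phase / (2 * math.pi) * d_amb
    return round((d_est - d_wrapped) / d_amb)


def pixel_distance(d_est, phase, f_mod):
    """Real per-pixel distance recovered from the unwrapping coefficient."""
    d_amb = C / (2 * f_mod)
    n = unwrapping_coefficient(d_est, phase, f_mod)
    return n * d_amb + phase / (2 * math.pi) * d_amb
```

For instance, at 100 MHz (ambiguity distance 1.5 m), a target truly at 3.2 m yields a wrapped phase equivalent to 0.2 m; an estimated distance of about 3.1 m selects N = 2, recovering 3.2 m.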
  • the size information of the target object is the image size of the target object
  • Obtaining the estimated distance between the target object and the TOF sensor according to the size information of the target object includes: determining the estimated distance according to the image size of the target object and a preset relationship. The preset relationship indicates that multiple first distances correspond to multiple preset image sizes; the image size of the target object is a target preset image size among the multiple preset image sizes, and the estimated distance is the first distance corresponding to the target preset image size.
  • the image size is the number of pixels occupied by the target object in the first image
  • The object ranging device locally stores the correspondence between multiple first distances and multiple preset image sizes; the first distances, the preset image sizes, and their correspondence are obtained from empirical values.
  • The object ranging device determines the number of pixels corresponding to the target area, that is, the image size of the target object. If the image size of the target object is the target preset image size among the multiple preset image sizes, then based on the above preset relationship, the estimated distance is determined to be the first distance corresponding to the target preset image size.
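As a sketch of this lookup (the pixel counts and distances below are invented placeholders for the empirical preset relationship, and nearest-match selection is an added assumption):

```python
# Hypothetical preset relationship: (preset image size in pixels, first distance in m).
PRESET_RELATION = [
    (40000, 0.3),
    (10000, 0.6),
    (2500, 1.2),
    (625, 2.4),
]


def estimated_distance(n_pixels):
    """Pick the preset image size nearest the measured pixel count of the
    target area and return its associated first distance."""
    size, dist = min(PRESET_RELATION, key=lambda entry: abs(entry[0] - n_pixels))
    return dist
```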
  • The single-frequency ranging technology can calculate the possible signal periods corresponding to the measured object (and hence its possible positions), but it cannot determine which signal period the real position of the measured object corresponds to; which signal period the real position corresponds to can be represented by the unwrapping coefficient.
  • In existing techniques, the unwrapping coefficient is only calculated in multi-frequency ranging technology; the current single-frequency ranging technology does not involve the unwrapping coefficient. In this embodiment, however, the unwrapping coefficient can be calculated within the single-frequency ranging technology.
  • The object ranging device can judge the approximate distance between the target object and itself according to the number of pixels occupied by the target area in the first image, and can then calculate the distance corresponding to each pixel in the target area according to the approximate distance and the second phase map.
  • the size information of the target object is the actual physical size of the target object
  • the estimated distance between the target object and the TOF sensor is obtained according to the size information of the target object, including: calculating the distance between the target object and the TOF sensor according to the second phase map A plurality of second distances between the sensors, each of the plurality of second distances corresponds to an estimated size of a target object; matching the actual physical size of the target object with the plurality of estimated sizes, and determining the actual physical size The estimated size of the matched target; determining the second distance corresponding to the estimated size of the target as the estimated distance.
  • The object ranging device calculates multiple distances (also called "second distances") between the target object and the TOF sensor according to the second phase map; the multiple second distances are the possible distances between the target object and the TOF sensor, i.e. the possible positions of the target object. The object ranging device can calculate an estimated size of the target object for each of the multiple second distances, i.e. each second distance corresponds to one estimated size of the target object. The object ranging device then matches the actual physical size of the target object with the multiple estimated sizes and determines the target estimated size matching the actual physical size, where the actual physical size of the target object is an empirical value; for example, if the category of the target object is a human face, the actual width of the face is about 15 cm. Finally, the object ranging device determines the second distance corresponding to the target estimated size as the above estimated distance.
  • As noted above, the single-frequency ranging technology can calculate the possible signal periods corresponding to the measured object (and hence its possible positions), but it cannot determine which signal period the real position of the measured object corresponds to; that signal period can be represented by the unwrapping coefficient.
  • In existing techniques, the unwrapping coefficient is only calculated in multi-frequency ranging technology; the current single-frequency ranging technology does not involve it. In this embodiment, however, the unwrapping coefficient can be calculated within the single-frequency ranging technology.
  • The object ranging device can select one second distance from the multiple calculated second distances as the estimated distance according to the actual physical size of the target object, and can then calculate the distance of each pixel in the target area according to the estimated distance and the second phase map.
  • Compared with multi-frequency ranging, the solution in this embodiment can save power consumption of the object distance measuring device.
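The size-matching selection described above can be sketched under a simple pinhole-camera assumption (illustrative only; the patent does not specify this computation): the physical size implied by a candidate distance is roughly pixel width × distance / focal length in pixels.

```python
def estimated_size(width_px, distance, focal_px):
    """Physical width implied by an observed pixel width at a candidate
    distance, under an assumed pinhole-camera model."""
    return width_px * distance / focal_px


def select_second_distance(second_distances, width_px, focal_px, actual_size):
    """Pick the candidate (second) distance whose implied size best matches
    the known actual physical size (e.g. about 0.15 m for a human face)."""
    return min(
        second_distances,
        key=lambda d: abs(estimated_size(width_px, d, focal_px) - actual_size),
    )
```

With candidate distances 0.4 m, 1.9 m, and 3.4 m (one wrapped measurement plus multiples of a 1.5 m ambiguity distance), a 100-pixel-wide face seen through an assumed 1000-pixel focal length matches the 15 cm face width only at 1.9 m.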
  • Before obtaining the estimated distance between the target object and the TOF sensor according to the size information of the target object, the method includes: determining the category of the target object; and obtaining the size information of the target object according to the category.
  • the object ranging device can determine the category of the target object through machine vision technology.
  • the category of the target object can be human face, hand, machine part, etc.
  • The size information of the target object can be obtained by querying a database according to the category; the database includes size information for various categories of objects.
  • the object ranging device can locally store size information of multiple types of objects, so that the above object ranging method can be applied to multiple types of objects, thus making the method suitable for various application scenarios.
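A minimal sketch of such a category-to-size lookup, with made-up entries (the 0.15 m face width echoes the empirical face-width example in this document; the other values are placeholders):

```python
# Hypothetical local database of actual physical sizes (m) per object category.
SIZE_DATABASE = {
    "face": 0.15,          # typical face width, as in the example above
    "hand": 0.10,          # placeholder value
    "machine part": 0.25,  # placeholder value
}


def size_for_category(category):
    """Query the locally stored size information for a recognized category."""
    return SIZE_DATABASE[category]
```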
  • the category of the target object is a human face.
  • The target area includes a first area and a second area. When there is no phase inversion in the phase delays corresponding to the pixels in the target area, the unwrapping coefficients corresponding to all pixels in the target area are the same. When the phase delays corresponding to the pixels in the target area do exhibit phase inversion, the unwrapping coefficients include a first unwrapping coefficient and a second unwrapping coefficient. Here, phase inversion means that the first phase delay and the second phase delay are not in the same signal period; the first phase delay is the phase delay corresponding to the pixels in the first area, and the second phase delay is the phase delay corresponding to the pixels in the second area. The first phase delay is associated with the first unwrapping coefficient, and the second phase delay with the second unwrapping coefficient; the first unwrapping coefficient is used to calculate the depth map corresponding to the first area, and the second unwrapping coefficient is used to calculate the depth map corresponding to the second area.
  • For example, if the phase delay corresponding to the second area is 365 degrees, a conventional system directly converts 365 degrees into 5 degrees (365 degrees minus 360 degrees) and calculates the distance from the 5-degree phase delay.
  • Since 5 degrees (the measured value) and 365 degrees (the true value) differ, this causes incoherence in the depth values of the target area.
  • In this embodiment, the unwrapping coefficients N1 and N2 indicate the signal periods corresponding to the first area and the second area, and the signal period indicates the actual distance corresponding to each area, so that the depth of the target area is continuous and close to the above estimated distance.
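The 365-degree versus 5-degree example can be reproduced with per-pixel unwrapping; the 100 MHz modulation frequency and rounded speed of light below are assumptions chosen for illustration:

```python
C, F = 3e8, 100e6
D_AMB = C / (2 * F)  # 1.5 m ambiguity distance at 100 MHz


def pixel_depth(phase_deg, d_est):
    """Choose the unwrapping coefficient N per pixel so that the depth
    lands nearest the estimated distance, keeping the depth map continuous."""
    d_wrap = (phase_deg / 360.0) * D_AMB
    n = round((d_est - d_wrap) / D_AMB)
    return n * D_AMB + d_wrap


# First-area pixel: 355 deg; second-area pixel: measured 5 deg (true 365 deg).
# With d_est = 1.5 m, N1 = 0 and N2 = 1, and the two depths remain adjacent
# instead of jumping by almost a full ambiguity distance.
```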
  • In a second aspect, an embodiment of the present application provides an object distance measuring device, including: an acquisition module, configured to use a time-of-flight (TOF) sensor to acquire a first image and a first phase map corresponding to the first image; and a processing module, configured to identify the target area corresponding to the target object in the first image, and the second phase map corresponding to the target area in the first phase map. The processing module is also configured to obtain the estimated distance of the target object from the TOF sensor according to the size information of the target object, where the size information includes the image size of the target object in the first image or the actual physical size of the target object; to calculate the unwrapping coefficient corresponding to each pixel in the target area according to the estimated distance and the second phase map; and to determine the depth map of the target area according to the unwrapping coefficient and the second phase map.
  • the size information of the target object is the image size of the target object; the processing module is further configured to determine the estimated distance according to the image size of the target object and a preset relationship, and the preset relationship indicates a plurality of first The distance corresponds to a plurality of preset image sizes, the image size of the target object is the target preset image size in the plurality of preset image sizes, and the estimated distance is the first distance corresponding to the target preset image size.
  • the processing module is further specifically configured to: calculate a plurality of second distances between the target object and the TOF sensor according to the second phase map, and each second distance in the plurality of second distances corresponds to a The estimated size of the target object; matching the actual physical size of the target object with multiple estimated sizes to determine the estimated target size matching the actual physical size; determining the second distance corresponding to the estimated target size as the estimated distance.
  • the processing module is further specifically configured to: determine the category of the target object; and acquire size information of the target object according to the category.
  • the category of the target object is a human face.
  • The target area includes a first area and a second area. When there is no phase inversion in the phase delays corresponding to the pixels in the target area, the unwrapping coefficients corresponding to all pixels in the target area are the same. When there is a phase inversion in the phase delays corresponding to the pixels in the target area, the unwrapping coefficients include a first unwrapping coefficient and a second unwrapping coefficient, where phase inversion means that the first phase delay and the second phase delay are not in the same signal period; the first phase delay is the phase delay corresponding to the pixels in the first area, and the second phase delay is the phase delay corresponding to the pixels in the second area. The first phase delay is related to the first unwrapping coefficient, and the second phase delay to the second unwrapping coefficient; the first unwrapping coefficient is used to calculate the depth map corresponding to the first area, and the second unwrapping coefficient is used to calculate the depth map corresponding to the second area.
  • In a third aspect, an embodiment of the present application provides an object ranging device, including a processor coupled with at least one memory. The processor is used to read the computer program stored in the at least one memory, so that the device executes the method described in any one of the above first aspect.
  • In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium for storing computer programs or instructions which, when executed, cause a computer or processor to execute the method described in any one of the above first aspect.
  • In a fifth aspect, an embodiment of the present application provides a computer program product including instructions which, when executed by a computer or processor, enable the computer or processor to implement the method described in any one of the above first aspect.
  • FIG. 1 is a schematic diagram of the TOF ranging principle
  • FIG. 2 is a schematic diagram of a depth map
  • FIG. 3a is a schematic diagram of a scene where a measured object may appear in the single-frequency ranging technology
  • FIG. 3b is a schematic diagram of a scene where a measured object may appear in the multi-frequency ranging technology
  • FIG. 4 is a schematic structural diagram of a ranging system in an embodiment of the present application.
  • FIG. 5 is a schematic diagram of a transmitted signal and a reflected signal in an embodiment of the present application
  • FIG. 6 is a schematic flow chart of the steps of an object ranging method in the embodiment of the present application.
  • FIG. 7 is a schematic diagram of a target area in the first image and a second phase image corresponding to the target area in the embodiment of the present application;
  • Fig. 8 is a relationship diagram of the number of pixels occupied by faces at different distances in the embodiment of the present application.
  • FIG. 9 is a schematic diagram of a plurality of second distances and the actual physical size of a human face corresponding to each second distance in the embodiment of the present application.
  • FIG. 10 is a schematic structural diagram of an embodiment of an object ranging device in the embodiment of the present application.
  • FIG. 11 is a schematic structural diagram of an embodiment of an electronic device in the embodiment of the present application.
  • plural means two or more than two.
  • At least one of the following or similar expressions refer to any combination of these items, including any combination of single or plural items.
  • at least one item (piece) of a, b, or c can represent: a, b, c, a and b, a and c, b and c, or a and b and c, wherein a, b, c can be single or multiple.
  • words such as “first” and “second” are used to distinguish the same or similar items with basically the same function and effect. Those skilled in the art can understand that words such as “first” and “second” do not limit the number and execution order, and words such as “first” and “second” do not necessarily limit the difference.
  • words such as “exemplary” or “for example” are used to represent examples, illustrations or descriptions. Any embodiment or design solution described as “exemplary” or “for example” in the embodiments of the present application shall not be interpreted as being more preferred or more advantageous than other embodiments or design solutions. Rather, the use of words such as “exemplary” or “such as” is intended to present relevant concepts in a concrete manner for easy understanding.
  • According to the TOF ranging principle, when measuring the distance of the measured object, the modulated light source emits a high-frequency infrared modulation signal to illuminate the target object; when the modulated signal is reflected back to the surface of the TOF ranging system's sensor, a distance-dependent phase delay occurs.
  • The TOF sensor receives the signal and demodulates the phase delay caused by the flight, then calculates the distance between the sensor and the target object based on known quantities such as the speed of light and the modulation frequency.
  • Each pixel unit in the TOF sensor array independently receives and demodulates the distance information from its corresponding point on the object surface, thereby producing the depth map of the target object.
  • a depth image also called a range image, refers to an image in which the distance (depth) of each point on the measured object is collected as a pixel value.
  • the depth map directly reflects the geometry of the visible surface of the measured object. Please refer to FIG. 2, the visualized depth map is shown in FIG. 2, different depth values are mapped to different colors, and thus the depth map is visualized in a color image.
  • TOF ranging includes single-frequency ranging technology and multi-frequency ranging technology.
  • The range of the traditional single-frequency ranging technology is limited. Under a single modulation frequency, it can only measure objects within the ambiguity distance (as given by formula (1) below), with the distance calculated according to formula (2) below. If the position of the measured object exceeds the ambiguity distance, phase wrapping appears; refer to Figure 3a for understanding.
  • The single-frequency ranging technology can calculate the possible positions of the measured object, but it cannot determine which signal period the real position of the measured object corresponds to; that signal period can be represented by the unwrapping coefficient.
  • The modulation frequency can be reduced to expand the ambiguity distance, so that the real distance of the measured object falls within the ambiguity distance and the object can be measured accurately; however, this method reduces the measurement accuracy.
  • Multi-frequency ranging technology adds one or more additional modulation frequencies and measures the distance of the measured object with each in sequence, as shown in Figure 3b.
  • Each modulation frequency corresponds to its own ambiguity distance.
  • The real distance is the one consistent with the measurements at all modulation frequencies.
  • The beat frequency of the combined modulation frequencies is generally lower, which extends the measurable distance.
  • For example, with modulation frequency a = 100 MHz, the corresponding ambiguity distance U_100MHz is 1.5 m; with modulation frequency b = 80 MHz, the corresponding ambiguity distance U_80MHz is 1.875 m.
  • the multi-frequency ranging technology can accurately measure the real distance of the measured object
  • the multi-frequency ranging technology will increase the power consumption of the ranging system, reduce the frame rate, and limit the application scenarios of the ranging system.
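For contrast with the single-frequency approach of this application, the way two modulation frequencies jointly disambiguate range can be sketched as follows (a brute-force search, illustrative only and not the patent's algorithm; the 100 MHz / 80 MHz figures match the example above):

```python
from math import gcd

C = 3e8  # speed of light (m/s), rounded


def extended_unambiguous_range(f_a, f_b):
    """Range jointly covered by two frequencies, set by their beat (GCD)
    frequency: 100 MHz and 80 MHz beat at 20 MHz, giving 7.5 m."""
    return C / (2 * gcd(int(f_a), int(f_b)))


def resolve_multifreq(d_wrap_a, d_wrap_b, u_a, u_b, d_max, step=1e-3, tol=5e-3):
    """Find the smallest distance <= d_max whose wrapped residues match
    both single-frequency measurements (u_a, u_b are ambiguity distances)."""
    for i in range(int(d_max / step) + 1):
        d = i * step
        if abs(d % u_a - d_wrap_a) < tol and abs(d % u_b - d_wrap_b) < tol:
            return d
    return None
```

An object truly at 3.2 m wraps to 0.2 m at 100 MHz and to 1.325 m at 80 MHz; only one distance below 7.5 m is consistent with both residues.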
  • c is the speed of light
  • F is the modulation frequency
  • the ranging system includes a TOF module 40 and a processing module 41, wherein the TOF module 40 includes a TOF chip 403, a laser 402 and a lens 404 (including a filter).
  • The TOF chip 403 includes a TOF sensor 4033, a controller 4031, and an analog-to-digital converter (ADC) 4032.
  • the controller 4031 is used for generating a transmission signal of a certain modulation frequency.
  • Phase is a measure of the change of a signal waveform, usually expressed in degrees (angles).
  • The emitted infrared continuous optical signal, i.e. the transmitted signal (also called the "light wave signal"), can be denoted s(t).
  • In the expression for s(t), t represents the time, ω represents the angular velocity, and ωt represents the angle.
  • the controller 4031 sends electrical signals to the laser driver 401 and the TOF sensor 4033 respectively.
  • the laser driver 401 is used to drive the laser 402 according to the change of the electrical signal, and the laser 402 is used to emit the corresponding optical wave signal s(t).
  • The light wave signal is reflected by the measured object; the lens 404 receives the reflected signal and sends it to the TOF sensor 4033, which receives the reflected signal g(t) containing the phase delay.
  • the modulated cosine signal and the reflected signal g(t) can be expressed by the following formula (4).
  • phase delay represents the propagation delay of light waves during flight.
  • The TOF sensor 4033 is also used to mix and integrate s(t) and g(t) (as shown in formula (5) below) to obtain the integral value, which is output to the processing module 41 through the ADC 4032; the processing module 41 performs phase calculation and intensity calculation on the integral value to obtain the grayscale image and phase image of the measured object.
  • s(t) represents the transmitted signal
  • g(t) represents the reflected signal
  • The 4 different τ values are respectively substituted into the above formula (5) to obtain a four-phase integral value (also called 4-phase raw data).
  • The ADC 4032 outputs the four-phase integral value to the processing module 41.
  • the processing module 41 is used to perform phase calculation (as shown in the following formula (6)) and intensity calculation (as shown in the following formula (7) and formula (8)) according to the four-phase bare data.
  • The phase delay in formula (6) below is used to obtain the phase image of the object under test, and the intensity B in formula (7) below can be used to obtain the grayscale image of the object under test.
  • may be 0° or 90°.
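The four-phase calculation can be sketched with the standard TOF demodulation formulas; these are assumed here to correspond to formulas (6) to (8), whose exact forms are shown as images in the original patent and may differ:

```python
import math


def demodulate(a0, a90, a180, a270):
    """Standard four-phase TOF demodulation: a0..a270 are the integral
    values obtained with control-signal offsets of 0/90/180/270 degrees."""
    phase = math.atan2(a90 - a270, a0 - a180) % (2 * math.pi)  # phase delay
    amplitude = 0.5 * math.sqrt((a90 - a270) ** 2 + (a0 - a180) ** 2)
    intensity = (a0 + a90 + a180 + a270) / 4.0  # grayscale value
    return phase, amplitude, intensity
```

Feeding in synthetic integral values of the form B + A·cos(φ − θ) recovers φ, A, and B, which is how the phase image and grayscale image are separated from the same raw data.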
  • The processing module 41 identifies the target area corresponding to the target object in the grayscale image and the phase image corresponding to the target area. The processing module 41 can obtain the estimated distance from the target object to the TOF sensor 4033 according to the size information of the target object, and calculate the unwrapping coefficient corresponding to each pixel in the target area according to the estimated distance and the phase map. Determining the unwrapping coefficient means determining the signal period corresponding to each pixel in the target area; the depth map of the target area can then be determined according to the unwrapping coefficient and the phase map corresponding to the target area.
  • After the unwrapping coefficient corresponding to each pixel is determined, the real distance corresponding to each pixel can be determined. Even if the actual distance of the target object exceeds the ambiguity distance, the actual distance of the target object can be detected, and the measurement accuracy can be guaranteed.
  • An object ranging method provided in an embodiment of the present application may be executed by an object ranging device.
  • the structure of the object ranging device is as shown in FIG. 4 above.
  • the object ranging device includes a TOF module and a processing module.
  • The TOF module and the processing module are integrated in the same device.
  • the object distance measuring device is a dedicated distance measuring device, or the object distance measuring device may be a terminal. Terminals include but are not limited to mobile phones, computers, tablet computers, vehicle terminals, smart homes (such as smart TVs), game devices, robots, etc.
  • the TOF module and the processing module can be respectively set in different devices.
  • the TOF module is set in the TOF camera
  • the processing module is set in the object distance measuring device
  • the TOF camera and the object distance measuring device are connected in communication
  • The object distance measuring device can also be a terminal.
  • Such terminals include but are not limited to mobile phones, computers, tablet computers, vehicle terminals, smart home devices (such as smart TVs), game devices, robots, etc.
  • the object ranging device performs the following steps 601 to 605 .
  • Step 601. The object ranging device acquires a first image and a first phase map corresponding to the first image by using a TOF sensor.
  • the object ranging device uses the TOF sensor to acquire the first image.
  • the first image is a grayscale image of the scene to be photographed.
  • the first image is obtained by using the above formula (7) or the above formula (8).
  • the object ranging device further includes a color image sensor, the first image may also be an RGB (red, green, blue) image.
  • the object ranging device is a mobile phone, the mobile phone includes a color image sensor, and the first image is an RGB image collected by the mobile phone through the color image sensor.
  • the object ranging device uses the TOF sensor to obtain raw data, and calculates, from the raw data, the gray value corresponding to each pixel in the first image and the phase delay corresponding to the light-wave flight time, thereby obtaining the first phase map corresponding to the first image.
  • Based on the principle of TOF ranging, when measuring the distance of the target object, the modulated light source emits a high-frequency infrared modulated signal to irradiate the target object; when the modulated signal is reflected back to the surface of the TOF sensor, a distance-related phase delay occurs.
  • Each pixel in the first image corresponds to a phase delay in the first phase map, and the first phase map can indicate the distance information between the target object and the object distance measuring device (refer to the above formula (2) for understanding).
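The phase-to-distance relationship referenced above (formula (2) is not reproduced in this text) can be sketched as follows; this is a minimal illustration of single-frequency TOF ranging, with constant and function names chosen by us, not taken from the application:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def ambiguity_distance(mod_freq_hz):
    # Maximum unambiguous range for one signal period: U = c / (2 * f)
    return C / (2.0 * mod_freq_hz)

def phase_to_distance(phase_delay_rad, mod_freq_hz):
    # Distance recovered from a phase delay within one signal period:
    # d = (delta_phi / 2*pi) * U
    return (phase_delay_rad / (2.0 * math.pi)) * ambiguity_distance(mod_freq_hz)
```

At a 100 MHz modulation frequency the ambiguity distance is roughly 1.5 m; any target beyond it wraps back into the same phase range, which is exactly the ambiguity that the unwrapping coefficient in the later steps resolves.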
  • Step 602. The object ranging device identifies a target area corresponding to the target object in the first image, and a second phase map corresponding to the target area in the first phase map.
  • FIG. 7 is a schematic diagram of a target area of the first image and a second phase map corresponding to the target area.
  • the object ranging device identifies a target area in the first image, where the target area is a position corresponding to the target object in the first image.
  • the object ranging device confirms the target area in the first image, and determines the second phase map corresponding to the target area in the first phase map according to the position of the target area.
  • the object distance measuring device has a built-in object recognition model (for example, a neural network model), and the object distance measuring device uses the object recognition model to identify the target object and mark the target object's location information.
  • the target area can also be the focus area of the TOF sensor.
  • the TOF sensor receives the light waves reflected by the target object to perform automatic focusing.
  • the target object can be enclosed by a frame, and the area enclosed by the frame is the focus area.
  • the target area can be understood as a region of interest (region of interest, ROI).
  • the ROI area is an image area selected from an image, and this area is a key area for image analysis.
  • Step 603. The object ranging device obtains the estimated distance of the target object from the TOF sensor according to the size information of the target object, wherein the size information includes the image size of the target object in the first image, or the actual physical size of the target object.
  • the size information of the target object is the image size of the target object.
  • the category of the target object is not limited.
  • the category of the target object is illustrated by taking a human face as an example.
  • FIG. 8 is a schematic diagram of the relationship between distance (the first distance) and the number of pixels occupied by a face at that distance.
  • Figure 8 shows the number of pixels occupied by adult and child faces captured at different distances (first distances) by a certain camera with all parameters determined.
  • the first distance is the multiple distances shown on the horizontal axis in FIG. 8
  • the multiple preset image sizes are the multiple sizes shown on the vertical axis in FIG. 8 .
  • the number of pixels occupied by the face is the image size of the face.
  • the approximate position of the face can be roughly judged by the number of pixels occupied by the face.
  • the object ranging device determines the estimated distance according to the image size of the target object and a preset relationship.
  • the estimated distance can be understood as an approximate distance.
  • the object ranging device stores size information in advance, and the size information includes a plurality of first distances and a plurality of preset image sizes, and a one-to-one correspondence between the plurality of first distances and the plurality of preset image sizes.
  • the image size of the target object is a target preset image size among the multiple preset image sizes, and the estimated distance is a first distance corresponding to the target preset image size.
  • the image size is H (horizontal) pixels × V (vertical) pixels.
  • the target object takes a child's face as an example.
  • the image size is 100 pixels, and the first distance corresponding to 100 pixels is 0.6 m. That is, when the image size of the target object is 100 pixels, the estimated distance obtained according to the preset relationship is 0.6 m.
  • the object ranging device may determine an approximate distance (estimated distance) between the human face and the ranging device according to the number of pixels occupied by the human face in the first image.
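A minimal sketch of this lookup is given below. Only the 100-pixels-to-0.6 m pair comes from the text; the other table entries are illustrative assumptions of our own, as are the names:

```python
# Assumed preset relationship: face image size (pixels) -> first distance (m).
# Only the 100 px -> 0.6 m pair appears in the text; the rest are invented.
PRESET_RELATION = {140: 0.4, 100: 0.6, 70: 0.9, 50: 1.2}

def estimate_distance(image_size_px):
    # Pick the preset image size closest to the measured image size,
    # and return the first distance that corresponds to it.
    target_size = min(PRESET_RELATION, key=lambda s: abs(s - image_size_px))
    return PRESET_RELATION[target_size]
```

A measured face occupying 95 pixels would match the 100-pixel preset and yield an estimated distance of 0.6 m; the estimate is intentionally coarse, since it only has to select the correct signal period, not the exact distance.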
  • the size information of the target object is the actual physical size of the target object.
  • the category of the target object is illustrated by taking a human face as an example.
  • the object ranging device calculates a plurality of distances (also referred to as "second distances") between the target object and the TOF sensor according to the second phase map, and each second distance among the plurality of second distances corresponds to an estimated size of the target object.
  • the object distance measuring device performs calculations based on the second phase map. Since a phase wrapping phenomenon may exist, that is, it is impossible to determine which signal period the actual distance of the target object corresponds to, multiple different distances will be calculated for the different signal periods.
  • the object ranging device recovers three distances from the second phase map, and the positions corresponding to these three distances are positions where the human face may appear. Referring to Fig. 9, the positions where the face may appear are 25 cm, 75 cm and 125 cm respectively.
  • the object ranging device calculates the corresponding three human face sizes according to the three distances, and the human face widths (estimated sizes) can be calculated to be 5cm, 15cm and 25cm respectively.
  • the estimated width of the corresponding face is 5cm; when the face position is 75cm, the estimated width of the corresponding face is 15cm; when the face position is 125cm, the estimated width of the corresponding face is 25cm.
  • the field of view of the camera is known.
  • the field of view is m degrees.
  • the width of the first image occupies p pixels, and the width of the face occupies q pixels; therefore, the angle θ in FIG. 9 is calculated according to the following formula (9).
  • the estimated width (estimated size) of the human face can then be calculated through a trigonometric formula. It should be noted that the number of second distances here is just an example for convenience of description; in actual application the number of second distances may be larger.
  • the object ranging device matches the actual physical size of the target object with a plurality of estimated sizes to determine the estimated size of the target that matches the actual physical size.
  • the actual physical size of the human face is 15 cm (the approximate width of a human face); matching the actual physical size of the target object against the multiple estimated sizes (5 cm, 15 cm and 25 cm) gives a target estimated size of 15 cm.
  • the object ranging device determines that the second distance corresponding to the estimated size of the target is the estimated distance D E .
  • the target estimated size (15cm) is the size closest to the real face width (that is, the actual physical size of the face), so 75cm corresponding to 15cm is selected as the estimated distance.
  • the object ranging device can recover the possible positions of the target object from the second phase map, calculate an estimated size of the target object for each possible position, select the estimated size (target estimated size) that matches, i.e. is closest to, the actual physical size of the target object, and use the second distance corresponding to the target estimated size as the estimated distance.
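The candidate-matching procedure above can be sketched as follows. This is a non-authoritative reading of formula (9): the angle subtended by the face is taken as the face's pixel fraction of the image width times the field of view m, and the implied width at each candidate distance as 2·d·tan(θ/2); all names are our own:

```python
import math

def estimated_width(distance_m, face_px, image_px, fov_deg):
    # Formula (9)-style angle: theta = (q / p) * m
    theta = math.radians(fov_deg * face_px / image_px)
    # Physical width the face would have if it sat at this candidate distance
    return 2.0 * distance_m * math.tan(theta / 2.0)

def pick_estimated_distance(second_distances, face_px, image_px, fov_deg, actual_width_m):
    # One estimated size per candidate (second) distance; keep the candidate
    # whose estimated size is closest to the actual physical size.
    widths = [estimated_width(d, face_px, image_px, fov_deg) for d in second_distances]
    best = min(range(len(second_distances)), key=lambda i: abs(widths[i] - actual_width_m))
    return second_distances[best]
```

With three candidates of 0.25 m, 0.75 m and 1.25 m and a 15 cm face width, the implied widths are roughly 5 cm, 15 cm and 25 cm, so the 0.75 m candidate is selected, matching the example in the text.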
  • Step 604. The object ranging device calculates the unwrapping coefficient corresponding to each pixel in the target area according to the estimated distance and the second phase map.
  • the unwrapping coefficient is used to indicate the signal period corresponding to the actual position of the target object. Please refer to FIG. 3a for understanding.
  • the traditional single-frequency ranging technology can calculate multiple possible positions of the measured object, but cannot determine which signal period the real position of the measured object corresponds to.
  • which signal period the real position of the measured object corresponds to can be represented by the unwrapping coefficient. Since the maximum distance that the object distance measuring device can measure within one signal period is the ambiguity distance, the unwrapping coefficient can also be understood as indicating within which ambiguity distance the actual position of the target object lies.
  • the object ranging device calculates the unwrapping coefficient according to the following formula (10):
  • N represents the unwrapping coefficient
  • round represents rounding calculation
  • D E represents the estimated distance
  • U represents the ambiguity distance
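Formula (10) itself is not reproduced in this text; one plausible form consistent with the symbols listed above (an assumption on our part) subtracts the within-period phase distance from the estimated distance and rounds the quotient by the ambiguity distance:

```python
def unwrap_coefficient(d_estimated, d_phase, ambiguity):
    # Assumed reading of formula (10): N = round((D_E - d_phase) / U),
    # i.e. how many whole signal periods lie before the within-period
    # distance d_phase recovered from the phase map.
    return round((d_estimated - d_phase) / ambiguity)

def real_distance(d_phase, n, ambiguity):
    # Unwrapped distance: place d_phase into its true signal period.
    return n * ambiguity + d_phase
```

For example, with an ambiguity distance U = 0.5 m, a phase-derived distance of 0.25 m and an estimated distance of 0.8 m, N = 1 and the real distance is 0.75 m, which the raw phase alone could not distinguish from 0.25 m.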
  • since the human face has depth, there may be a phase inversion in the phase delays corresponding to the pixels in the target area.
  • the target area corresponding to the face in the first image includes a first area and a second area, and the first area and the second area are spatially continuous without a depth jump in the grayscale image (or RGB image).
  • Phase inversion means that the first phase delay and the second phase delay are not in the same signal cycle, the first phase delay is the phase delay corresponding to multiple pixels in the first area, and the second phase delay is the phase delay corresponding to multiple pixels in the second area phase delay.
  • the first area is the area corresponding to the nose tip area of the human face
  • the second area is the area corresponding to the ears of the human face. The position of the nose tip is closer to the object distance measuring device than the position of the ear.
  • a signal period is [0 degrees, 360 degrees]
  • the phase delay corresponding to at least one pixel in the nose tip area is 355 degrees
  • the phase delay corresponding to at least one pixel in the ear area is 365 degrees; that is, since the phase delay corresponding to the nose-tip area and the phase delay corresponding to the ear area are not within one signal period, there is a phase inversion in the target area.
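A small sketch of how such an inversion could be detected between two spatially continuous sub-regions. The threshold and names are our own assumptions; note that measured wrapped delays lie in [0°, 360°), so a raw delay of 365° appears in the phase map as 5°:

```python
def has_phase_inversion(delays_a_deg, delays_b_deg, jump_deg=180.0):
    # Compare the mean wrapped phase delays of two spatially continuous
    # sub-regions; a gap larger than the threshold suggests the raw delays
    # straddle a signal-period boundary (e.g. nose at 355 deg, ear at 5 deg).
    mean_a = sum(delays_a_deg) / len(delays_a_deg)
    mean_b = sum(delays_b_deg) / len(delays_b_deg)
    return abs(mean_a - mean_b) > jump_deg
```

Since the two regions are known to be continuous in depth, a near-360° jump in wrapped phase cannot be a real depth jump and is instead treated as an inversion.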
  • Case 1: when the phase delays corresponding to multiple pixels in the target area have no phase inversion, the unwrapping coefficients corresponding to the pixels in the target area are the same. In this case, the unwrapping coefficients corresponding to all the pixels in the target area are the same value, and the depth values of the pixels in the target area are close to the estimated distance D E .
  • Case 2: when the phase delays corresponding to multiple pixels in the target area exhibit phase inversion, the unwrapping coefficients include a first unwrapping coefficient and a second unwrapping coefficient.
  • the first phase delay is related to the first unwrapping coefficient
  • the second phase delay is related to the second unwrapping coefficient.
  • the first unwrapping coefficient is used to calculate the depth map corresponding to the first region
  • the second unwrapping coefficient is used to calculate the depth map corresponding to the second region.
  • the first phase delay of 355 degrees is substituted into the above formula (10) to obtain the first unwrapping coefficient N 1
  • the phase delay of 5 degrees is substituted into the above formula (10)
  • the second unwrapping coefficient N 2 is obtained.
  • N1 and N2 indicate the real positions corresponding to the first area and the second area.
  • Step 605. The object ranging device determines the depth map of the target area according to the unwrapping coefficient and the second phase map.
  • the object ranging device calculates the actual distance corresponding to each pixel in the target area according to the unwrapping coefficient and the second phase map.
  • the object ranging device can calculate the actual distance corresponding to each pixel according to the following formula (11).
  • the entire target area uses the same unwrapping coefficient to calculate the actual distance corresponding to each pixel.
  • the first area uses the first unwrapping coefficient to calculate the actual distance corresponding to each pixel in the first area
  • the second area uses the second unwrapping coefficient to calculate the actual distance corresponding to each pixel in the second area , that is, different unwrapping coefficients are used in different regions within the target region.
  • N 1 = 0 indicates that the tip of the nose corresponds to the first signal period
  • N1 and N2 indicate the corresponding real positions of the first area and the second area.
  • In the conventional technique, the object ranging device does not calculate the unwrapping coefficient: if the phase delay corresponding to the ear is 365 degrees, it directly converts 365 degrees into 5 degrees (365 degrees - 360 degrees) and calculates the distance according to the above formula (2). Since the distances corresponding to 5 degrees and 365 degrees are different, the depth values of the face area become incoherent. In this embodiment, N1 and N2 indicate the two signal periods corresponding to the first area and the second area, and thus the real positions corresponding to the first area and the second area, so that a coherent depth map can finally be generated for the target area corresponding to the face.
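Putting the two coefficients together, a sketch of the per-region depth reconstruction. Formula (11) is not reproduced in this text; the form D_r = (N + φ/360)·U is our assumption, as is the illustrative 1.5 m ambiguity distance:

```python
def pixel_distance(phase_deg, n, ambiguity_m):
    # Assumed formula (11)-style reconstruction: D_r = (N + phi/360) * U
    return (n + (phase_deg % 360.0) / 360.0) * ambiguity_m

U = 1.5  # illustrative ambiguity distance, metres

# First area (nose tip): wrapped phase 355 deg, first unwrapping coefficient N1 = 0
nose = pixel_distance(355.0, 0, U)
# Second area (ear): raw 365 deg wraps to 5 deg, second unwrapping coefficient N2 = 1
ear = pixel_distance(5.0, 1, U)
```

With N2 = 1 the ear lands a few centimetres beyond the nose instead of snapping back to the start of the signal period, so the depth values across the face remain coherent.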
  • the object ranging device can calculate the actual distance corresponding to each pixel in the target area.
  • further data processing is required.
  • the three-dimensional coordinates corresponding to each pixel are calculated according to the two-dimensional coordinates of each pixel and the real distance D r corresponding to each pixel; please refer to the following formulas (12) to (14).
  • (u i , v i ) represents the i-th pixel point
  • u 0 , v 0 represent the coordinates of the optical center in the image coordinate system
  • f x represents the focal length in the horizontal direction
  • f y represents the focal length in the vertical direction
  • (X w , Y w , Z w ) represent coordinates in the world coordinate system.
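Formulas (12) to (14) correspond to a standard pinhole back-projection using the symbols listed above. The sketch below assumes D_r is used directly as the depth Z_w along the optical axis (if D_r were a radial distance, a cosine correction would be needed first, which the text does not specify):

```python
def pixel_to_world(u, v, d_r, u0, v0, fx, fy):
    # Back-project pixel (u, v) with real distance d_r:
    #   Z_w = D_r
    #   X_w = (u - u0) * Z_w / f_x
    #   Y_w = (v - v0) * Z_w / f_y
    z_w = d_r
    x_w = (u - u0) * z_w / fx
    y_w = (v - v0) * z_w / fy
    return x_w, y_w, z_w
```

A pixel at the optical center (u0, v0) maps to (0, 0, D_r); pixels further from the center map to proportionally larger lateral offsets.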
  • the ranging device obtains the approximate distance (estimated distance) between the target object and the TOF sensor according to the size information of the target object, and can calculate the unwrapping coefficient corresponding to each pixel in the target area according to the estimated distance and the phase map.
  • the unwrapping coefficient can indicate the signal period corresponding to each pixel of the target object.
  • the real distance corresponding to each pixel can then be determined, that is, the depth map of the target area can be obtained.
  • the depth map of the target area corresponding to the target object can be obtained without reducing the modulation frequency to increase the ambiguity distance, thereby ensuring the ranging accuracy.
  • the above-mentioned target objects are illustrated by taking human faces as an example.
  • the category of the target object is not limited. In different application scenarios, the target object may belong to different categories.
  • the above method may further include the following steps.
  • the object ranging device uses machine vision technology to identify the category to which the target object belongs, and queries the object size information corresponding to each category that has been stored in the database according to the category to which the target object belongs.
  • the category of the target object is a human face
  • the object ranging device queries the size information of the human face in the database.
  • the size information includes the size information in step 603 in the above embodiment
  • the size information a includes multiple first distances a and multiple preset face image sizes, and the correspondence between the multiple first distances a and the multiple preset face image sizes.
  • the size information of the human face is the actual physical size of the human face (eg, the width of the human face is 15 cm).
  • the object ranging device is a vehicle-mounted terminal.
  • the vehicle-mounted terminal can accurately detect the position of the driver's head to judge whether the driver is focused enough and whether the driver is fatigued, so as to initiate corresponding countermeasures.
  • the object ranging device is a mobile terminal (such as a mobile phone, a tablet computer, etc.). Compared with traditional 2D face recognition technology, the mobile terminal recognizes 3D faces, and the liveness detection is accurate and safe.
  • the object ranging method in the embodiments of the present application can thus be applied to face unlocking, mobile payment and similar application scenarios.
  • the object ranging device queries the size information of the hand in the database.
  • the size information includes size information b
  • the size information b includes a plurality of first distances b and a plurality of preset hand image sizes, and the correspondence between the plurality of first distances b and the plurality of preset hand image sizes.
  • the size information is the actual physical size of the hand.
  • the object ranging device is a vehicle-mounted terminal.
  • the object ranging device detects various positions of the user's hand during the movement process to control the vehicle entertainment system or the vehicle air conditioner.
  • the object distance measuring device is a smart TV or a computer; the object distance measuring device detects the position of the user's hand and outputs a corresponding game interface to realize human-computer interaction.
  • the type of the target object is a machine part
  • the object ranging device queries the size information of the machine part in the database.
  • the size information includes size information c
  • the size information c includes a plurality of first distances c and a plurality of preset machine-part image sizes, and the correspondence between the plurality of first distances c and the plurality of preset machine-part image sizes.
  • the size information is the actual physical size of the machine parts.
  • the object distance measuring device is an industrial robot, and the object distance measuring device detects the position of the machine parts, so that the machine parts can be accurately positioned and placed.
  • an embodiment of the present application provides an object distance measuring device 1000 , which is used to execute the method performed by the object distance measuring device in the foregoing method embodiments.
  • the object ranging device includes an acquisition module 1001 and a processing module 1002 .
  • An acquisition module 1001 configured to acquire a first image and a first phase map corresponding to the first image by using a time-of-flight TOF sensor;
  • a processing module 1002 configured to identify a target area corresponding to the target object in the first image, and a second phase image corresponding to the target area in the first phase image;
  • the processing module 1002 is further configured to acquire the estimated distance of the target object from the TOF sensor according to the size information of the target object, wherein the size information includes the image size of the target object in the first image, or the actual physical size of the target object;
  • the processing module 1002 is further configured to calculate an unwrapping coefficient corresponding to each pixel in the target area according to the estimated distance and the second phase map;
  • the processing module 1002 is further configured to determine a depth map of the target area according to the unwrapping coefficient and the second phase map.
  • the function of the obtaining module 1001 may be performed by a transceiver.
  • the transceiver has the function of sending and/or receiving.
  • the transceiver is replaced by a receiver and/or transmitter.
  • the function of the obtaining module 1001 may also be executed by a processor.
  • the processing module 1002 is a processor, and the processor is a general-purpose processor or a special-purpose processor.
  • the processor includes a transceiver unit configured to implement receiving and sending functions.
  • the transceiver unit is a transceiver circuit, or an interface, or an interface circuit.
  • the transceiver circuits, interfaces or interface circuits for realizing the receiving and sending functions may be deployed separately, or optionally integrated together.
  • the above-mentioned transceiver circuit, interface or interface circuit is used for reading and writing codes or data, or is used for signal transmission or transfer.
  • the acquiring module 1001 is configured to execute step 601 in the above method embodiment
  • the processing module 1002 is configured to execute step 602 to step 605 in the above method embodiment.
  • the size information of the target object is the image size of the target object
  • the processing module 1002 is further configured to determine the estimated distance according to the image size of the target object and a preset relationship; the preset relationship indicates a correspondence between multiple first distances and multiple preset image sizes, the image size of the target object is a target preset image size among the multiple preset image sizes, and the estimated distance is the first distance corresponding to the target preset image size.
  • processing module 1002 is also specifically configured to:
  • the second distance corresponding to the estimated size of the target is determined as the estimated distance.
  • processing module 1002 is also specifically configured to:
  • the category of the target object is a human face.
  • the target area includes a first area and a second area
  • the unwrapping coefficient includes a first unwrapping coefficient and a second unwrapping coefficient, wherein phase inversion means that the first phase delay and the second phase delay are not in the same signal period, and the first phase delay is the phase delay corresponding to multiple pixels in the first area;
  • the second phase delay is the phase delay corresponding to multiple pixels in the second area. The first phase delay is related to the first unwrapping coefficient, and the second phase delay is related to the second unwrapping coefficient; the first unwrapping coefficient is used to calculate the depth map corresponding to the first region, and the second unwrapping coefficient is used to calculate the depth map corresponding to the second region.
  • the embodiment of the present application provides an object distance measuring device 1100, where the object distance measuring device is used to execute the method performed by the object distance measuring device in the above method embodiments; please refer to the above method embodiments for details.
  • Object ranging devices include but are not limited to mobile phones, computers, tablet computers, vehicle terminals, smart homes (such as smart TVs), game devices, robots, etc.
  • a mobile phone is taken as an example of an object distance measuring device for description.
  • the object ranging device 1100 includes a processor 1101 , a memory 1102 , an input unit 1103 , a display unit 1104 , a TOF module 1105 , a communication unit 1106 , an audio circuit 1107 and other components.
  • the memory 1102 can be used to store software programs and modules, and the processor 1101 executes various functional applications and data processing of the device by running the software programs and modules stored in the memory 1102 .
  • the memory 1102 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
  • the processor 1101 includes, but is not limited to, various types of processors, such as one or more of a CPU, a DSP, and an image signal processor.
  • the processor 1101 is configured to execute the functions executed by the processing modules in FIG. 5 above.
  • the input unit 1103 can be used to receive inputted number or character information, and generate key signal input related to user settings and function control of the device.
  • the input unit 1103 may include a touch panel 1131 .
  • the touch panel 1131, also referred to as a touch screen, can collect touch operations of the user on or near it (for example, operations performed by the user on or near the touch panel 1131 using a finger, a stylus or any other suitable object or accessory).
  • the display unit 1104 can be used to display various image information.
  • the display unit 1104 may include a display panel 1141.
  • the display panel 1141 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like.
  • the touch panel 1131 can be integrated with the display panel 1141 to realize the input and output functions of the device.
  • the TOF module 1105 is used to generate a transmission signal of a certain frequency, receive the reflection signal reflected by the object under test (target object), generate raw data according to the transmission signal and the reflection signal, and transmit the raw data to the processor 1101 .
  • the processor 1101 is configured to generate a grayscale image and a phase image according to the raw data, and generate a depth image corresponding to the target object according to the grayscale image and the phase image.
  • the TOF module 1105 please refer to the structure of the TOF module 40 in FIG. 5 above for understanding.
  • the communication unit 1106 is configured to establish a communication channel, so that the object distance measuring device is connected to a remote server through the communication channel, and obtain an object detection model and a scene recognition model from the remote server.
  • the communication unit 1106 may include communication modules such as a wireless local area network module, a Bluetooth module, a baseband module, and a radio frequency (RF) circuit corresponding to each communication module, for performing wireless local area network communication, Bluetooth communication, infrared communication and/or cellular communication system communication.
  • the communication module is used to control the communication of each component in the object ranging device, and can support direct memory access.
  • various communication modules in the communication unit 1106 generally appear in the form of integrated circuit chips, and can be selectively combined without including all communication modules and corresponding antenna groups.
  • the communication unit 1106 may only include a baseband chip, a radio frequency chip and corresponding antennas to provide communication functions in a cellular communication system.
  • via the wireless communication connection established by the communication unit 1106, the object ranging device can be connected to a cellular network or the Internet.
  • Audio circuitry 1107, speaker 1108, and microphone 1109 may provide an audio interface between the user and the handset.
  • the audio circuit 1107 can transmit the electrical signal converted from the received audio data to the speaker 1108, and the speaker 1108 converts it into an audio signal for output.
  • the microphone 1109 converts the collected sound signal into an electrical signal, which is received by the audio circuit 1107 and converted into audio data; the audio data is then output to the processor 1101 for processing and sent, for example, to another mobile phone through the communication unit 1106, or output to the memory 1102 for further processing.
  • the object ranging device may also include more components, such as a camera, a power supply, etc., which will not be described in detail.
  • the embodiment of the present application also provides a computer-readable storage medium for storing computer programs or instructions.
  • when the computer programs or instructions are executed, the computer executes the method of the object distance measuring device in the above method embodiments.
  • An embodiment of the present application provides a chip, and the chip includes a processor and a communication interface, where the communication interface is, for example, an input/output interface, a pin, or a circuit.
  • the processor is used to read instructions to execute the method executed by the object ranging device in the above method embodiment.
  • the embodiment of the present application also provides a computer program product.
  • the computer program product includes computer program code.
  • when the computer program code is executed by a computer or a processor, the computer or processor implements the method of the object distance measuring device in the above method embodiments.
  • a computer program product includes one or more computer instructions. When the computer program instructions are loaded or executed on the computer, the processes or functions according to the embodiments of the present application will be generated in whole or in part.
  • the computer can be a general purpose computer, a special purpose computer, a computer network, or other programmable devices.
  • Computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium, for example, from one website, computer, server or data center to another website, computer, server or data center by wired means (such as coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless means (such as infrared, radio, or microwave).
  • the computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or a data center that includes one or more sets of available media.
  • the available media may be magnetic media (eg, floppy disk, hard disk, magnetic tape), optical media (eg, DVD), or semiconductor media.
  • the semiconductor medium may be a solid state drive (SSD).

Abstract

The invention relates to a method and device for measuring the distance of an object. The method comprises: acquiring a first image and a first phase map corresponding to the first image by using a time-of-flight (ToF) sensor (4033) (601); recognizing a target area corresponding to a target object in the first image, and determining, on the basis of the target area, a second phase map corresponding to the target area in the first phase map (602), so as to focus on the phase map corresponding to the target object; acquiring, according to size information of the target object, an estimated distance of the target object from the ToF sensor (4033) (603); calculating, according to the estimated distance and the second phase map, an unwrapping coefficient corresponding to each pixel in the target area (604), the unwrapping coefficient indicating the real signal period in which each pixel is located; and determining a depth map of the target area according to the unwrapping coefficient and the second phase map (605), the depth map showing the real distance corresponding to each pixel. The real distance of the target object can thus be measured with single-frequency ranging technology, and the measurement accuracy can be guaranteed.
PCT/CN2021/128943 2021-11-05 2021-11-05 Method and device for measuring the distance of an object WO2023077412A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202180099316.9A CN117480408A (zh) 2021-11-05 2021-11-05 Object distance measurement method and device
PCT/CN2021/128943 WO2023077412A1 (fr) 2021-11-05 2021-11-05 Method and device for measuring the distance of an object

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/128943 WO2023077412A1 (fr) 2021-11-05 2021-11-05 Method and device for measuring the distance of an object

Publications (1)

Publication Number Publication Date
WO2023077412A1 (fr)

Family

ID=86240416

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/128943 WO2023077412A1 (fr) 2021-11-05 Method and device for measuring the distance of an object

Country Status (2)

Country Link
CN (1) CN117480408A (fr)
WO (1) WO2023077412A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117011365A (zh) * 2023-10-07 2023-11-07 宁德时代新能源科技股份有限公司 Dimension measurement method and apparatus, computer device, and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110109133A (zh) * 2019-05-05 2019-08-09 武汉市聚芯微电子有限责任公司 Time-of-flight-based distance compensation method and ranging method
US20200090355A1 (en) * 2018-09-14 2020-03-19 Facebook Technologies, Llc Depth measurement assembly with a structured light source and a time of flight camera
CN111708032A (zh) * 2019-03-18 2020-09-25 英飞凌科技股份有限公司 Resolving distance measurement ambiguity using coded-modulation phase image frames
US20200363198A1 (en) * 2019-05-13 2020-11-19 Lumileds Llc Depth sensing using line pattern generators
US20210004937A1 (en) * 2019-07-02 2021-01-07 Microsoft Technology Licensing, Llc Machine-learned depth dealiasing
CN112986972A (zh) * 2019-12-13 2021-06-18 华为技术有限公司 Method and apparatus for detecting object position

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117011365A (zh) * 2023-10-07 2023-11-07 宁德时代新能源科技股份有限公司 Dimension measurement method and apparatus, computer device, and storage medium
CN117011365B (zh) * 2023-10-07 2024-03-15 宁德时代新能源科技股份有限公司 Dimension measurement method and apparatus, computer device, and storage medium

Also Published As

Publication number Publication date
CN117480408A (zh) 2024-01-30

Similar Documents

Publication Publication Date Title
CN105026955B (zh) Method and apparatus for denoising data from a distance-sensing camera
TWI706152B (zh) Optoelectronic module for distance measurement and/or multi-dimensional imaging, and method of obtaining distance or three-dimensional data
US9432593B2 (en) Target object information acquisition method and electronic device
WO2020168742A1 (fr) Method and device for positioning a vehicle body
JP2021516401A (ja) Data fusion method and related apparatus
WO2022126427A1 (fr) Point cloud processing method, point cloud processing apparatus, mobile platform, and computer storage medium
US20220277467A1 (en) Tof-based depth measuring device and method and electronic equipment
JP2017528621A (ja) 不規則状物体の採寸用のハンドヘルドマルチセンサシステム
CN206348456U (zh) Solid-state area-array detection device
CN110494827A (zh) Tracking the position and orientation of objects in a virtual reality system
CN111045029A (zh) Fused depth measurement device and measurement method
CN113160328A (zh) Extrinsic parameter calibration method and system, robot, and storage medium
CN103412318A (zh) Portable infrared target locator and positioning control method
WO2023077412A1 (fr) Method and device for measuring the distance of an object
US20220128659A1 (en) Electronic device including sensor and method of operation therefor
US20180143317A1 (en) Positioning device and positioning method
CN106772408A (zh) Solid-state area-array detection device and detection method
CN113052890A (zh) Depth ground-truth acquisition method, apparatus and system, and depth camera
CN109032354B (zh) Electronic device, gesture recognition method therefor, and computer-readable storage medium
WO2022228461A1 (fr) Three-dimensional ultrasonic imaging method and system using lidar
CN114119696A (zh) Depth image acquisition method, apparatus and system, and computer-readable storage medium
CN109000565B (zh) Measurement method, apparatus, and terminal
CN112987022A (zh) Ranging method and apparatus, computer-readable medium, and electronic device
CN112630750A (zh) Sensor calibration method and sensor calibration apparatus
WO2021168847A1 (fr) Time-of-flight (ToF)-based ranging chip, apparatus, and method

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 21962926

Country of ref document: EP

Kind code of ref document: A1