WO2021195886A1 - Distance determination method, movable platform, and computer-readable storage medium - Google Patents

Distance determination method, movable platform, and computer-readable storage medium

Info

Publication number
WO2021195886A1
Authority
WO
WIPO (PCT)
Prior art keywords
distance
image
target
target object
movable platform
Prior art date
Application number
PCT/CN2020/082199
Other languages
English (en)
Chinese (zh)
Inventor
刘洁
周游
覃政科
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 filed Critical 深圳市大疆创新科技有限公司
Priority to CN202080005139.9A priority Critical patent/CN112771575A/zh
Priority to PCT/CN2020/082199 priority patent/WO2021195886A1/fr
Publication of WO2021195886A1 publication Critical patent/WO2021195886A1/fr

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/08Systems determining position data of a target for measuring distance only
    • G01S17/10Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/246Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/248Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving reference images or patches
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/269Analysis of motion using gradient-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence

Definitions

  • This application relates to the technical field of distance measurement, in particular to a distance determination method, a movable platform, and a computer-readable storage medium.
  • the mobile platform is equipped with a Time Of Flight (TOF) ranging device.
  • the TOF ranging device emits light pulses from the transmitter toward the object, and the receiver calculates the round-trip time of the light pulse from the transmitter to the object and back to the receiver, thereby determining the distance between the movable platform and the target object.
  • in most scenes, the distance measured by the TOF ranging device is accurate; however, in scenes with highly reflective objects, such as road signs, the TOF ranging device experiences periodic aliasing, so that the measured distance is inaccurate, which can adversely affect the movable platform, for example when the distance measured by the TOF ranging device is used by the movable platform to avoid obstacles.
  • in this case, the movable platform cannot avoid obstacles accurately, and the safety of the movable platform cannot be guaranteed. Therefore, how to accurately measure the distance between the movable platform and an object is a problem to be solved urgently.
  • the present application provides a method for determining a distance, a movable platform, and a computer-readable storage medium, aiming to accurately measure the distance between the movable platform and an object.
  • the present application provides a method for determining distance, which is applied to a movable platform, wherein the movable platform includes a vision sensor and a TOF ranging device, and the TOF ranging device includes a transmitter for emitting light signals and a receiver for receiving the light signals reflected by a target object.
  • the method includes:
  • the target distance between the movable platform and the target object is determined according to the plurality of second distances and the first distance.
  • the present application also provides a movable platform including a vision sensor, a TOF distance measuring device, a memory, and a processor, and the processor is connected to the vision sensor and the TOF distance measuring device;
  • the TOF distance measuring device includes a transmitting device for transmitting light signals and a receiving device for receiving light signals reflected by a target object;
  • the memory is used to store a computer program
  • the processor is configured to execute the computer program and, when executing the computer program, implement the following steps:
  • the target distance between the movable platform and the target object is determined according to the plurality of second distances and the first distance.
  • the present application also provides a computer-readable storage medium, the computer-readable storage medium stores a computer program, and when the computer program is executed by a processor, the processor implements the steps of any of the distance determination methods provided in the specification of this application.
  • the embodiment of the present application provides a method for determining a distance, a movable platform, and a computer-readable storage medium.
  • the first distance between the movable platform and a target object is determined by the TOF distance measuring device, and the second distances between the movable platform and multiple spatial points on the target object are determined from the first image and the second image of the target object output by the vision sensor; the target distance between the movable platform and the target object is then determined according to the multiple second distances and the first distance. Since the ranging results of both the TOF distance measuring device and the vision sensor are comprehensively considered, the distance between the movable platform and the object can be measured accurately.
  • FIG. 1 is a schematic structural diagram of a movable platform provided by an embodiment of the present application
  • FIG. 2 is a schematic flowchart of steps of a method for determining a distance provided by an embodiment of the present application
  • FIG. 3 is a schematic diagram of a scene in which the TOF distance measuring device and the vision sensor in an embodiment of the present application measure the distance between the target object and the movable platform;
  • FIG. 4 is a schematic flowchart of sub-steps of the distance determination method in FIG. 2;
  • FIG. 5 is a schematic flowchart of sub-steps of the distance determination method in FIG. 2;
  • Fig. 6 is a schematic block diagram of the structure of an unmanned aerial vehicle provided by an embodiment of the present application.
  • This application provides a distance determination method, a movable platform, and a computer-readable storage medium.
  • the distance determination method is applied to a movable platform.
  • the movable platform 100 includes a vision sensor 110 and a Time Of Flight (TOF) distance measuring device 120.
  • the TOF distance measuring device 120 includes a transmitting device for emitting light signals and a receiving device for receiving the light signals reflected by the target object.
  • the vision sensor 110 may be a monocular vision device or a binocular vision device
  • the target object is the light spot area formed where the light signal emitted by the transmitter of the TOF distance measuring device irradiates the physical object
  • the first distance between the movable platform and the target object (physical object) can be measured by the TOF distance measuring device 120
  • the two images of the target object output by the vision sensor 110 can determine the second distances between the movable platform and the multiple spatial points on the target object, and from the first distance and the multiple second distances the distance between the movable platform and the target object (physical object) can be accurately determined.
  • movable platforms include drones, mobile robots, and pan-tilt vehicles.
  • Unmanned aerial vehicles include rotary-wing UAVs, such as quad-rotor UAVs, hexa-rotor UAVs, and octo-rotor UAVs; fixed-wing UAVs; or a combination of rotary-wing and fixed-wing UAVs, which is not limited here.
  • FIG. 2 is a schematic flowchart of steps of a method for determining a distance according to an embodiment of the present application. Specifically, as shown in FIG. 2, the distance determination method includes step S101 to step S104.
  • Step S101 Obtain a first distance between the movable platform and the target object collected by the TOF distance measuring device.
  • the TOF distance measuring device includes a transmitting device for transmitting light signals and a receiving device for receiving the light signals reflected by the target object.
  • the light source of the transmitting device is an infrared light source. The transmitting device of the TOF distance measuring device emits the light signal and records the emission time point. When the light signal emitted by the transmitter meets an object, it forms a light spot on the surface of the object, thereby obtaining the target object on the object; at the same time, the light signal is reflected by the surface of the object, yielding the light signal reflected by the target object.
  • the receiving device of the TOF distance measuring device can receive the light signal reflected by the target object, and record the receiving time point.
  • from the transmission time point and the receiving time point, the flight time of the light signal between the movable platform and the target object can be calculated, and then the first distance between the movable platform and the target object can be calculated according to the flight time and the speed of light.
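As a minimal sketch of this time-of-flight calculation (the function name and the 100 ns example round trip are illustrative, not taken from the application): half the round-trip flight time multiplied by the speed of light gives the one-way first distance.

```python
C = 299_792_458.0  # speed of light, m/s

def tof_first_distance(emit_time_s, receive_time_s):
    """First distance from the recorded emission and reception time
    points: half the round-trip flight time times the speed of light."""
    flight_time = receive_time_s - emit_time_s
    return C * flight_time / 2.0

d = tof_first_distance(0.0, 100e-9)  # a 100 ns round trip is roughly 15 m
```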
  • the target object is the light spot area formed by the light signal emitted by the transmitter of the TOF distance measuring device irradiating the object.
  • Step S102 Acquire a first image and a second image of the target object output by the vision sensor.
  • the first image includes the first target image area
  • the first target image area is the area where the target object is projected on the first image
  • the second image includes the second target image area
  • the second target image area is the area where the target object is projected on the second image
  • the position of the center pixel of the first target image area in the first image and the position of the center pixel of the second target image area in the second image are determined based on the installation position relationship between the vision sensor and the TOF distance measuring device
  • the position of the first target image area in the first image and the position of the second target image area in the second image are determined based on the installation position relationship between the vision sensor and the TOF distance measuring device and the first distance. According to the installation position relationship between the vision sensor and the TOF distance measuring device, the rotation matrix and the displacement matrix between the vision sensor and the TOF distance measuring device can be determined.
  • the vision sensor includes any one of a monocular vision device and a binocular vision device. If the vision sensor is a monocular vision device, the first image and the second image are separated by a preset time; if the vision sensor is a binocular vision device, the first image is the image output by the first camera in the binocular vision device, and the second image is the image output by the second camera in the binocular vision device.
  • the internal parameter matrix of the first camera in the binocular vision device and the rotation matrix and displacement matrix between the first camera and the TOF distance measuring device are obtained; according to the internal parameter matrix of the first camera and the rotation matrix and displacement matrix between the first camera and the TOF distance measuring device, the position point of the central spatial point of the target object under the angle of view of the first camera is determined; the position point is marked in the preset image of the first camera, and the three-dimensional position coordinates of the multiple spatial points on the target object are determined according to the first distance; according to the three-dimensional position coordinates of the multiple spatial points, the multiple spatial points on the target object are projected into the preset image marked with the position point to obtain the first image containing the first target image area.
  • similarly, the internal parameter matrix of the second camera in the binocular vision device and the rotation matrix and displacement matrix between the second camera and the TOF distance measuring device are obtained; according to the internal parameter matrix of the second camera and the rotation matrix and displacement matrix between the second camera and the TOF distance measuring device, the position point of the central spatial point of the target object under the angle of view of the second camera is determined; the position point is marked in the preset image of the second camera, and the three-dimensional position coordinates of the multiple spatial points on the target object are determined according to the first distance; according to the three-dimensional position coordinates of the multiple spatial points, the multiple spatial points on the target object are projected into the preset image marked with the position point to obtain the second image containing the second target image area.
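The projection described above can be sketched as follows (a hedged illustration: the internal parameter matrix K, rotation matrix R, and displacement vector t below are placeholder values, not calibration data from the application):

```python
import numpy as np

def project_points(K, R, t, points_tof):
    """Project 3-D points given in the TOF device frame into a camera
    image: apply the extrinsics (rotation R, displacement t), then the
    camera's internal parameter matrix K, then dehomogenize."""
    pts_cam = (R @ points_tof.T).T + t      # N x 3, camera frame
    uvw = (K @ pts_cam.T).T                 # N x 3, homogeneous pixels
    return uvw[:, :2] / uvw[:, 2:3]         # N x 2 pixel coordinates

# Placeholder calibration: 400 px focal length, 640x480 image with the
# principal point at its centre, camera aligned with the TOF device.
K = np.array([[400.0, 0.0, 320.0],
              [0.0, 400.0, 240.0],
              [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.zeros(3)
uv = project_points(K, R, t, np.array([[0.0, 0.0, 5.0]]))
```

A spatial point 5 m straight ahead of this aligned camera projects to the principal point (320, 240).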
  • the position point of the central spatial point of the target object under the angle of view of the first camera can be determined in advance, and the position point can be marked in the preset image of the first camera to obtain the first marked image.
  • similarly, according to the internal parameter matrix of the second camera and the rotation matrix and displacement matrix between the second camera and the TOF distance measuring device, the position of the central spatial point of the target object under the angle of view of the second camera can be determined in advance and marked in the preset image of the second camera to obtain the second marked image; the first marked image and the second marked image are then stored in the memory of the movable platform.
  • in this way, the first marked image and the second marked image can be obtained directly, and no real-time determination is needed, which reduces the amount of calculation on the movable platform.
  • the method of determining the three-dimensional position coordinates of the multiple spatial points on the target object according to the first distance is specifically: obtaining the effective field of view of the light source of the TOF distance measuring device, and according to the effective field of view of the light source and The first distance is to determine the three-dimensional position coordinates of the multiple spatial points on the target object relative to the TOF distance measuring device.
  • the effective field of view of the light source of the TOF distance measuring device is determined according to the aperture of the light source emission port of the TOF distance measuring device.
  • the light signal emitted by the light source is emitted to the outside through the light source emission port.
  • the larger the aperture of the light source emission port, the larger the effective field of view of the light source; the smaller the aperture of the light source emission port, the smaller the effective field of view of the light source.
  • the effective field angle of the light source is 10°.
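The three-dimensional position coordinates of spatial points on the light spot can be approximated from the first distance and the effective field of view, as in this sketch (the sampling of eight rim points and the flat-spot assumption are illustrative simplifications, not the application's method):

```python
import math

def spot_points(first_distance, fov_deg=10.0, n=8):
    """Approximate 3-D coordinates, relative to the TOF device, of points
    on the rim of the circular light spot: the spot radius follows from
    the effective field of view of the light source, and the spot is
    treated as lying in a plane at the measured distance."""
    radius = first_distance * math.tan(math.radians(fov_deg) / 2.0)
    return [(radius * math.cos(2.0 * math.pi * k / n),
             radius * math.sin(2.0 * math.pi * k / n),
             first_distance) for k in range(n)]

pts = spot_points(10.0)  # 10 m first distance, 10 degree effective FOV
```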
  • as shown in FIG. 3, the light signal emitted by the transmitting device 121 of the TOF distance measuring device 120 irradiates the object Q to form a circular spot area, that is, the target object P, and the receiving device 122 of the TOF distance measuring device 120 receives the light signal reflected by the target object. According to the installation position relationship between the first camera 111 and the TOF distance measuring device, the internal parameter matrix, and the first distance, the theoretical area P1 of the target object projected in the first image 10 can be determined; according to the installation position relationship between the second camera 112 and the TOF distance measuring device, the internal parameter matrix, and the first distance, the theoretical area P2 of the target object projected in the second image 20 can be determined.
  • the shape of the light spot area may be a circle, an ellipse or a rectangle, which is not specifically limited in this application.
  • Step S103 Determine a second distance between the movable platform and the multiple spatial points on the target object according to the first image and the second image.
  • the depth values between the movable platform and the multiple spatial points on the target object can be determined according to the first image and the second image, that is, the second distances between the movable platform and the multiple spatial points on the target object.
  • step S103 specifically includes: sub-steps S1031 to S1032.
  • the first feature points corresponding to the multiple spatial points on the target object are extracted from the first image based on a preset feature point extraction algorithm, and the second feature points matching the first feature points are determined from the second image based on a preset feature point tracking algorithm, obtaining the feature point matching pairs corresponding to the multiple spatial points on the target object; or, the first feature points corresponding to the multiple spatial points on the target object are extracted from the second image based on the preset feature point extraction algorithm, and the second feature points matching the first feature points are determined from the first image based on the preset feature point tracking algorithm, obtaining the feature point matching pairs corresponding to the multiple spatial points on the target object.
  • the preset feature point extraction algorithm includes at least one of the following: the Harris corner detection algorithm, the scale-invariant feature transform (SIFT) algorithm, the Speeded-Up Robust Features (SURF) algorithm, and the FAST (Features from Accelerated Segment Test) feature point detection algorithm; the preset feature point tracking algorithm includes, but is not limited to, the KLT (Kanade-Lucas-Tomasi feature tracker) algorithm.
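As an illustration of the first listed extraction algorithm, here is a minimal Harris corner response in plain NumPy (a simplified sketch using a 3x3 box filter instead of the usual Gaussian window; production code would typically use an optimized library implementation):

```python
import numpy as np

def harris_response(img, k=0.05):
    """Harris corner response R = det(M) - k * trace(M)^2, where M is the
    structure tensor of the image gradients smoothed with a 3x3 box
    filter. Large positive R marks a corner, negative R an edge."""
    gy, gx = np.gradient(img.astype(float))

    def box3(a):
        # 3x3 box filter via padded shifted sums (edge padding).
        p = np.pad(a, 1, mode="edge")
        h, w = a.shape
        return sum(p[i:i + h, j:j + w]
                   for i in range(3) for j in range(3)) / 9.0

    sxx, syy, sxy = box3(gx * gx), box3(gy * gy), box3(gx * gy)
    det = sxx * syy - sxy * sxy
    trace = sxx + syy
    return det - k * trace * trace

# A white square on a black background: corners respond positively,
# edges negatively, flat regions not at all.
img = np.zeros((20, 20))
img[5:15, 5:15] = 1.0
resp = harris_response(img)
```

This corner/edge/flat separation is exactly the property that makes the extracted points suitable for tracking between the two images.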
  • the first target image area of the target object in the first image and the second target image area of the target object in the second image are determined; the feature point matching pairs corresponding to the multiple spatial points on the target object are determined from the first target image area and the second target image area.
  • the position of the first target image area in the first image and the position of the second target image area in the second image are determined according to the installation position relationship between the vision sensor and the TOF distance measuring device and the first distance, and according to the installation position relationship between the vision sensor and the TOF distance measuring device, the rotation matrix and the displacement matrix between the vision sensor and the TOF distance measuring device can be determined.
  • the first target image area in the first image and the second target image area in the second image of the target object can be accurately determined through the installation position relationship between the vision sensor and the TOF distance measuring device and the first distance.
  • the method for determining the first target image area of the target object in the first image and the second target image area in the second image is specifically: determining the three-dimensional position coordinates of the target object according to the first distance, that is, obtaining the effective field of view of the light source of the TOF distance measuring device, and determining the three-dimensional position coordinates of the multiple spatial points on the target object relative to the TOF distance measuring device according to the effective field of view of the light source and the first distance; and, according to the three-dimensional position coordinates, projecting the target object into the first image and the second image to determine the first target image area and the second target image area.
  • the first distance determined by the TOF distance measuring device and the effective field of view angle of the light source can determine the three-dimensional position coordinates of multiple spatial points on the target object, and then based on the three-dimensional position coordinates, the target object can be projected onto the first image and the second image. In the image, the first target image area and the second target image area can be accurately identified.
  • the method of projecting the target object into the first image to determine the first target image area according to the three-dimensional position coordinates of the target object is specifically: acquiring the internal parameter matrix of the first camera in the binocular vision device and the rotation matrix and displacement matrix between the first camera and the TOF distance measuring device; determining, according to the internal parameter matrix of the first camera, the rotation matrix and displacement matrix between the first camera and the TOF distance measuring device, and the three-dimensional position coordinates of the multiple spatial points on the target object, the two-dimensional position coordinates of the multiple spatial points on the target object in the first camera coordinate system; and, according to the two-dimensional position coordinates of the multiple spatial points on the target object in the first camera coordinate system, marking the pixel points corresponding to the multiple spatial points in the first image, the area where the circumscribed circle formed by the marked pixel points is located being the first target image area.
  • the method of projecting the target object into the second image to determine the second target image area according to the three-dimensional position coordinates of the target object is specifically: acquiring the internal parameter matrix of the second camera in the binocular vision device and the rotation matrix and displacement matrix between the second camera and the TOF distance measuring device; determining, according to the internal parameter matrix of the second camera, the rotation matrix and displacement matrix between the second camera and the TOF distance measuring device, and the three-dimensional position coordinates of the multiple spatial points on the target object, the two-dimensional position coordinates of the multiple spatial points on the target object in the second camera coordinate system; and, according to the two-dimensional position coordinates of the multiple spatial points on the target object in the second camera coordinate system, marking the pixel points corresponding to the multiple spatial points in the second image, the area where the circumscribed circle formed by the marked pixel points is located being the second target image area.
  • the second distance between the movable platform and the multiple spatial points on the target object can be determined.
  • the following takes the vision sensor being a binocular vision device as an example to explain the process of determining the second distances between the movable platform and the multiple spatial points based on the multiple feature point matching pairs.
  • for each feature point matching pair, the corresponding pixel difference (disparity) is determined; the preset focal length and the preset binocular distance of the binocular vision device are obtained; and the second distances between the movable platform and the multiple spatial points are determined according to the preset focal length, the preset binocular distance, and the pixel difference corresponding to each feature point matching pair.
  • the preset focal length is determined by calibrating the focal length of the binocular vision device, and the preset binocular distance is determined according to the installation positions of the first camera and the second camera in the binocular vision device.
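The triangulation above reduces to the standard stereo relation Z = f · b / d; a sketch (the focal length and baseline defaults are illustrative, not the device's calibration):

```python
def second_distance(disparity_px, focal_px=400.0, baseline_m=0.1):
    """Depth from stereo: Z = f * b / d, with the preset focal length in
    pixels, the preset binocular distance (baseline) in metres, and the
    pixel difference of a feature point matching pair in pixels."""
    return focal_px * baseline_m / disparity_px

# 4 px of disparity at 400 px focal length and a 0.1 m baseline is 10 m.
```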
  • Step S104 Determine a target distance between the movable platform and the target object according to a plurality of the second distances and the first distance.
  • the target distance between the movable platform and the target object may be determined through the first distance and the multiple second distances. Since the ranging results of both the TOF distance measuring device and the vision sensor are comprehensively considered, the distance between the movable platform and the object can be measured accurately.
  • step S104 specifically includes: sub-steps S1041 to S1042.
  • the credibility index of the first distance may be determined by the first distance and the plurality of second distances.
  • the credibility index is used to characterize the accuracy of the first distance between the movable platform and the target object collected by the TOF distance measuring device. The greater the credibility index, the higher the accuracy of the measured first distance; the smaller the credibility index, the lower the accuracy of the measured first distance.
  • the target space point is determined from the plurality of space points according to the plurality of second distances and the first distance, wherein the difference between the second distance corresponding to the target space point and the first distance is less than or equal to The preset threshold; the credibility index of the first distance is determined according to the target space point.
  • the preset threshold may be set based on actual conditions, which is not specifically limited in this application, for example, the preset threshold is 0.5 meters.
  • the method of determining the credibility index of the first distance according to the target spatial points is specifically: determining the credibility index of the first distance according to the number of target spatial points and the number of spatial points. That is, the spatial points whose second distance differs from the first distance by no more than the preset threshold are counted to obtain the first number (the number of target spatial points), the total number of spatial points is counted to obtain the second number, and the percentage of the first number in the second number is used as the credibility index of the first distance. For example, assuming that the number of target spatial points is 75 and the number of spatial points is 100, the percentage of the first number in the second number is 75%, so the credibility index of the first distance is 75%.
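The counting scheme above can be sketched directly (the function name and sample distances are illustrative):

```python
def credibility_by_count(second_distances, first_distance, threshold=0.5):
    """Fraction of spatial points whose vision-based second distance
    agrees with the TOF first distance to within the preset threshold."""
    first_number = sum(1 for d in second_distances
                       if abs(d - first_distance) <= threshold)
    return first_number / len(second_distances)

# 3 of 4 spatial points fall within 0.5 m of the 10 m TOF reading.
c = credibility_by_count([9.8, 10.2, 10.4, 12.0], 10.0)
```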
  • alternatively, the method of determining the credibility index of the first distance according to the target spatial points is specifically: determining the first weight value according to the pixel coordinates and pixel values of the corresponding feature points of the target spatial points in the first image or the second image; determining the second weight value according to the pixel coordinates and pixel values of the corresponding feature points of the spatial points in the first image or the second image; and determining the credibility index of the first distance according to the first weight value and the second weight value, that is, calculating the percentage of the first weight value in the second weight value and using that percentage as the credibility index of the first distance. For example, if the first weight value is 0.5 and the second weight value is 0.7, the first weight value accounts for 71.4% of the second weight value, so the credibility index of the first distance is 71.4%.
  • the first weight value is determined in a specific manner: according to the pixel coordinates and pixel values of the corresponding feature point of each target spatial point in the first image, the weight value of each target spatial point is determined, and the weight values of the target spatial points are accumulated to obtain the first weight value; similarly, the method for determining the second weight value is specifically: according to the pixel coordinates and pixel values of the corresponding feature point of each spatial point in the first image or the second image, the weight value of each spatial point is determined, and the weight values of the spatial points are accumulated to obtain the second weight value.
  • the method of determining the first weight value is specifically: determining the first target image area of the target object in the first image or the second target image area in the second image; the first weight value is determined according to the pixel coordinates and pixel values of the corresponding feature points of the target spatial points in the first target image area or the second target image area.
  • similarly, the second weight value is determined according to the pixel coordinates and pixel values of the corresponding feature points of the spatial points in the first target image area or the second target image area.
  • Using the feature points in the first target image area or the second target image area to participate in the calculation of the first weight value or the second weight value can reduce the amount of calculation and increase the processing speed.
  • the method for determining the weight value of a target space point or a space point is specifically: substituting the pixel coordinates and pixel value of the feature point corresponding to the target space point or space point in the first image or the second image into the weight value calculation formula to obtain the weight value of the target space point or space point.
  • the weight value calculation formula is w = I·e^(−((u_x − u_0)² + (v_x − v_0)²)/σ), where I is the pixel value of the feature point corresponding to the target space point or space point in the first image or the second image, (u_x, v_x) are the pixel coordinates of that feature point, (u_0, v_0) are the pixel coordinates of the feature point corresponding to the center space point of the target object in the first image or the second image, e is the natural constant, and σ is a preset coefficient determined empirically.
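Assuming the weight formula is the Gaussian-style falloff suggested by the variables above (pixel value I attenuated by the squared pixel distance to the target's center point, with preset coefficient sigma), a single point's weight could be sketched as follows; the function name and default sigma are illustrative assumptions:

```python
import math

def point_weight(I, u_x, v_x, u_0, v_0, sigma=1000.0):
    """Weight of one feature point: its pixel value I attenuated by the
    squared pixel distance to the target object's center (u_0, v_0)."""
    d2 = (u_x - u_0) ** 2 + (v_x - v_0) ** 2
    return I * math.exp(-d2 / sigma)
```

Points near the target's center keep nearly their full pixel value, while points far from the center contribute little to the accumulated weight.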
  • the credibility index of the first distance can be calculated according to the following formula: T = Σ_{i=1}^{m} I_i·e^(−((u_i − u_0)² + (v_i − v_0)²)/σ) / Σ_{j=1}^{n} I_j·e^(−((u_j − u_0)² + (v_j − v_0)²)/σ), where x_i is the i-th target space point in the set P, (u_i, v_i) and I_i are the pixel coordinates and pixel value of its corresponding feature point in the first image or the second image, and m is the number of target space points in the set P; x_j is the j-th space point in the set P′, (u_j, v_j) and I_j are the pixel coordinates and pixel value of its corresponding feature point in the first image or the second image, and n is the number of space points in the set P′, n being greater than m; (u_0, v_0) are the pixel coordinates of the feature point corresponding to the center space point of the target object in the first image or the second image. The numerator of T is the first weight value and the denominator of T is the second weight value.
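Reading the credibility index as the ratio of the first weight value (summed over the target space points in P) to the second weight value (summed over all space points in P′), a sketch follows; the (I, u, v) tuple format and the default sigma are assumptions, not from the source:

```python
import math

def point_weight(I, u, v, u_0, v_0, sigma=1000.0):
    """Pixel value attenuated by squared pixel distance to the center."""
    return I * math.exp(-((u - u_0) ** 2 + (v - v_0) ** 2) / sigma)

def credibility_index(target_points, all_points, u_0, v_0, sigma=1000.0):
    """First weight value (set P) divided by second weight value (set P');
    approaches 1 when most space points agree with the TOF reading."""
    w1 = sum(point_weight(I, u, v, u_0, v_0, sigma) for I, u, v in target_points)
    w2 = sum(point_weight(I, u, v, u_0, v_0, sigma) for I, u, v in all_points)
    return w1 / w2
```

Because P is a subset of P′, the ratio stays between 0 and 1, and it equals 1 only when every space point's second distance agrees with the first distance within the preset threshold.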
  • after the credibility index of the first distance is determined, it is determined whether the credibility index is greater than a preset credibility index; if so, the multiple second distances and the first distance are fused to determine the target distance between the movable platform and the target object.
  • the preset credibility index can be set according to actual conditions, which is not specifically limited in this application; for example, the preset credibility index is 75%.
  • the method of performing fusion processing on the multiple second distances and the first distance is specifically: determining a third distance according to the multiple second distances; obtaining a first weight coefficient for the first distance and a second weight coefficient for the third distance; calculating the product of the first distance and the first weight coefficient and the product of the third distance and the second weight coefficient; and summing the two products to obtain the target distance between the movable platform and the target object.
  • the sum of the first weight coefficient and the second weight coefficient is 1, and the first weight coefficient and the second weight coefficient can be set according to actual conditions, for example, the first weight coefficient is 0.5 and the second weight coefficient is 0.5.
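The fusion step above reduces to a convex combination of the two distances. A minimal sketch, using the 0.5/0.5 weights of the example as defaults (the function name is illustrative):

```python
def fuse_distances(first_distance, third_distance, w1=0.5, w2=0.5):
    """Target distance as the weighted sum of the TOF-based first distance
    and the vision-based third distance; the weights must sum to 1."""
    if abs(w1 + w2 - 1.0) > 1e-9:
        raise ValueError("weight coefficients must sum to 1")
    return first_distance * w1 + third_distance * w2
```

With equal weights, the fused result is simply the midpoint of the TOF and vision estimates; shifting weight toward w1 expresses greater trust in the TOF reading.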
  • the method for determining the third distance according to the multiple second distances is specifically: selecting any one of the multiple second distances as the third distance; or calculating the average of the multiple second distances and taking that average as the third distance; or selecting the smallest second distance and the largest second distance from the multiple second distances, determining their average, and taking that average as the third distance.
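The three alternatives for the third distance (pick any one, average all, or average the extremes) can be sketched as one selector; the mode names are illustrative:

```python
def third_distance(second_distances, mode="mean"):
    """Collapse the multiple second distances into a single third distance."""
    if mode == "any":        # select any one, e.g. the first
        return second_distances[0]
    if mode == "mean":       # average of all second distances
        return sum(second_distances) / len(second_distances)
    if mode == "midrange":   # average of the smallest and the largest
        return (min(second_distances) + max(second_distances)) / 2
    raise ValueError(f"unknown mode: {mode}")
```

The midrange variant is cheap and robust to the distribution's interior, while the mean uses every measurement; the choice can follow the platform's compute budget.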
  • if the credibility index is not greater than the preset credibility index, the target distance between the movable platform and the target object is determined according to the multiple second distances alone, that is: any one of the multiple second distances is selected as the target distance between the movable platform and the target object; or the average of the multiple second distances is calculated and taken as the target distance; or the smallest second distance and the largest second distance are selected from the multiple second distances, their average is determined, and that average is taken as the target distance between the movable platform and the target object.
  • in the embodiments described above, the first distance between the movable platform and the target object is determined by the TOF distance measuring device, the second distances between the movable platform and multiple spatial points on the target object are determined from the first image and the second image of the target object output by the vision sensor, and the target distance between the movable platform and the target object is then determined according to the multiple second distances and the first distance. Since the ranging results of both the TOF distance measuring device and the vision sensor are comprehensively considered, the distance between the movable platform and the object can be measured accurately.
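Putting the pieces together, the decision logic of the method (fuse when the TOF reading is credible, otherwise fall back to the vision result alone) might look like the sketch below; the 75% threshold and equal fusion weights come from the examples above, and all names are illustrative:

```python
def target_distance(first_distance, second_distances, credibility,
                    threshold=0.75, w1=0.5, w2=0.5):
    """Distance between the movable platform and the target object."""
    # Third distance from the vision measurements (mean variant).
    third = sum(second_distances) / len(second_distances)
    if credibility > threshold:
        # TOF reading is trustworthy: fuse it with the vision result.
        return first_distance * w1 + third * w2
    # TOF reading is not trustworthy: rely on the vision result only.
    return third
```

Usage: with a credible TOF reading of 2.0 m and vision distances averaging 4.0 m, the fused target distance is 3.0 m; with low credibility, the method returns 4.0 m.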
  • FIG. 6 is a schematic block diagram of a movable platform provided by an embodiment of the present application.
  • the movable platform 200 includes a processor 201, a memory 202, a vision sensor 203, and a TOF distance measuring device 204.
  • the processor 201, the memory 202, the vision sensor 203 and the TOF distance measuring device 204 are connected by a bus 205.
  • the bus 205 is, for example, an I2C (Inter-Integrated Circuit) bus.
  • the movable platform 200 includes drones, mobile robots, pan-tilt vehicles, and the like.
  • the drone can be a rotary-wing drone, such as a quad-rotor, hexa-rotor, or octo-rotor drone; it can also be a fixed-wing UAV, or a hybrid of rotary-wing and fixed-wing types, which is not limited here.
  • the processor 201 may be a micro-controller unit (MCU), a central processing unit (CPU), a digital signal processor (DSP), or the like.
  • the memory 202 may be a Flash chip, a read-only memory (ROM), a magnetic disk, an optical disk, a USB flash drive, a removable hard disk, or the like.
  • the TOF distance measuring device 204 includes a transmitting device for transmitting light signals and a receiving device for receiving light signals reflected by the target object.
  • the processor 201 is configured to run a computer program stored in the memory 202, and implement the following steps when the computer program is executed:
  • the target distance between the movable platform and the target object is determined according to the plurality of second distances and the first distance.
  • when the processor 201 determines the target distance between the movable platform and the target object according to a plurality of the second distances and the first distance, it is configured to:
  • the target distance between the movable platform and the target object is determined according to the credibility index, a plurality of the second distances, and the first distance.
  • when the processor 201 determines the credibility index of the first distance according to a plurality of the second distances and the first distance, it is configured to:
  • a target space point is determined from the plurality of space points, wherein the difference between the second distance corresponding to the target space point and the first distance is less than or equal to a preset threshold;
  • the credibility index of the first distance is determined according to the target space point.
  • when the processor 201 determines the credibility index of the first distance according to the target space points, it is configured to:
  • the credibility index of the first distance is determined according to the number of the target space points and the number of the space points.
  • when the processor 201 determines the credibility index of the first distance according to the target space points, it is configured to:
  • the credibility index of the first distance is determined according to the first weight value and the second weight value.
  • when the processor 201 determines the first weight value according to the pixel coordinates and pixel values of the corresponding feature points of the target space points in the first image or the second image, it is configured to:
  • the first weight value is determined according to the pixel coordinates and pixel values of the corresponding feature points of the target space point in the first target image area or the second target image area.
  • when the processor 201 determines the target distance between the movable platform and the target object according to the credibility index, a plurality of the second distances, and the first distance, it is configured to:
  • fusion processing is performed on a plurality of the second distances and the first distance to determine the target distance between the movable platform and the target object.
  • after the processor 201 determines whether the credibility index is greater than the preset credibility index, it is further configured to:
  • the target distance between the movable platform and the target object is determined according to a plurality of the second distances.
  • when the processor 201 determines the second distances between the movable platform and the multiple spatial points on the target object according to the first image and the second image, it is configured to:
  • a second distance between the movable platform and the plurality of spatial points is determined.
  • when the processor 201 determines the feature point matching pairs corresponding to the multiple spatial points on the target object from the first image and the second image, it is configured to:
  • the feature point matching pairs corresponding to the multiple spatial points on the target object are determined.
  • the position of the first target image area in the first image and the position of the second target image area in the second image are determined according to the installation position relationship between the vision sensor and the TOF distance measuring device and the first distance.
  • when the processor 201 determines the first target image area of the target object in the first image and the second target image area in the second image, it is configured to:
  • the target object is projected into the first image and the second image to determine the first target image area and the second target image area.
  • the vision sensor 203 includes any one of a monocular vision device and a binocular vision device.
  • when the vision sensor 203 is a monocular vision device, the first image and the second image are output by the monocular vision device at a preset time interval; when the vision sensor 203 is a binocular vision device, the first image is the image output by the first camera of the binocular vision device, and the second image is the image output by the second camera of the binocular vision device.
  • the embodiments of the present application also provide a computer-readable storage medium that stores a computer program; the computer program includes program instructions, and a processor executes the program instructions to implement the steps of the distance determination method provided in the foregoing embodiments.
  • the computer-readable storage medium may be an internal storage unit of the movable platform described in any of the foregoing embodiments, such as a hard disk or memory of the movable platform.
  • the computer-readable storage medium may also be an external storage device of the movable platform, such as a plug-in hard disk equipped on the movable platform, a smart media card (SMC), a secure digital (SD) card, or a flash card.


Abstract

Disclosed are a distance determination method, a control terminal, an unmanned aerial vehicle, and a storage medium. The method comprises: obtaining a first distance acquired by a TOF distance measuring device (S101); obtaining a first image and a second image output by a vision sensor (S102); determining multiple second distances according to the first image and the second image (S103); and determining a target distance according to the multiple second distances and the first distance (S104). The method enables accurate measurement of the distance between a movable platform and an object.
PCT/CN2020/082199 2020-03-30 2020-03-30 Distance determination method, movable platform, and computer-readable storage medium WO2021195886A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202080005139.9A CN112771575A (zh) 2020-03-30 2020-03-30 Distance determination method, movable platform, and computer-readable storage medium
PCT/CN2020/082199 WO2021195886A1 (fr) 2020-03-30 2020-03-30 Distance determination method, movable platform, and computer-readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/082199 WO2021195886A1 (fr) 2020-03-30 2020-03-30 Distance determination method, movable platform, and computer-readable storage medium

Publications (1)

Publication Number Publication Date
WO2021195886A1 true WO2021195886A1 (fr) 2021-10-07

Family

ID=75699498

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/082199 WO2021195886A1 (fr) 2020-03-30 2020-03-30 Distance determination method, movable platform, and computer-readable storage medium

Country Status (2)

Country Link
CN (1) CN112771575A (fr)
WO (1) WO2021195886A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110794861A (zh) * 2019-11-14 2020-02-14 Electric Power Research Institute of State Grid Shandong Electric Power Company Autonomous string-landing method and system for a flying insulator-string inspection robot

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113269824B (zh) * 2021-05-28 2023-07-07 Shaanxi Polytechnic Institute Image-based distance determination method and system
CN114396911B (zh) * 2021-12-21 2023-10-31 China Automotive Innovation Co., Ltd. Obstacle distance measurement method, apparatus, device, and storage medium
CN116990830B (zh) * 2023-09-27 2023-12-29 Ruichi Laser (Shenzhen) Co., Ltd. Distance positioning method and apparatus based on binocular vision and TOF, electronic device, and medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102914262A (zh) * 2012-09-29 2013-02-06 Beijing Institute of Control Engineering Non-cooperative target close-range measurement method based on additional line of sight
CN105572681A (zh) * 2014-10-31 2016-05-11 Rockwell Automation Safety AG Absolute distance measurement for time-of-flight sensors
CN107093195A (zh) * 2017-03-10 2017-08-25 Northwestern Polytechnical University Marker point positioning method combining laser ranging and a binocular camera
CN108037768A (zh) * 2017-12-13 2018-05-15 Changzhou Institute of Technology Unmanned aerial vehicle obstacle avoidance control system, obstacle avoidance control method, and unmanned aerial vehicle
US10346995B1 (en) * 2016-08-22 2019-07-09 AI Incorporated Remote distance estimation system and method

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102999939B (zh) * 2012-09-21 2016-02-17 Wei Yiqun Coordinate acquisition device, real-time three-dimensional reconstruction system and method, and stereoscopic interactive device
WO2017008224A1 (fr) * 2015-07-13 2017-01-19 SZ DJI Technology Co., Ltd. Method, device, and aircraft for detecting the distance to a moving object
CN107687841A (zh) * 2017-09-27 2018-02-13 ThunderSoft Co., Ltd. Distance measurement method and device
CN109902725A (zh) * 2019-01-31 2019-06-18 Beijing Dajia Internet Information Technology Co., Ltd. Moving target detection method and apparatus, electronic device, and storage medium



Also Published As

Publication number Publication date
CN112771575A (zh) 2021-05-07


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20928232

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20928232

Country of ref document: EP

Kind code of ref document: A1