WO2022262332A1 - Calibration method and apparatus for a fusion system of a distance measuring device and a camera - Google Patents

Calibration method and apparatus for a fusion system of a distance measuring device and a camera

Info

Publication number
WO2022262332A1
WO2022262332A1 (PCT/CN2022/080517)
Authority
WO
WIPO (PCT)
Prior art keywords
camera
spot
measuring device
distance measuring
coordinate information
Prior art date
Application number
PCT/CN2022/080517
Other languages
English (en)
Chinese (zh)
Inventor
刘浏
陈首彬
陈文胜
刘贤焯
闫敏
Original Assignee
深圳奥锐达科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳奥锐达科技有限公司
Publication of WO2022262332A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85Stereo camera calibration
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/02Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • G01B11/03Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by measuring coordinates of points
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • G06T2207/10012Stereo images

Definitions

  • the present application belongs to the technical field of distance measurement, and in particular relates to a calibration method and device for a distance measurement device and a camera fusion system.
  • the distance measuring device includes a depth camera based on the time-of-flight principle or a LiDAR (Light Detection and Ranging) sensor, and is used for acquiring three-dimensional data of the target.
  • the camera can provide rich visual texture information to compensate for the recognition and cognition shortcomings caused by the low resolution of the distance measuring device; at the same time, the direct 3D geometric measurement of the distance measuring device can compensate for the camera's lack of depth estimation and provide more accurate depth information.
  • the first problem to be solved for multi-sensor fusion is how to calibrate the data of different sensors into the same coordinate system.
  • the high-precision calibration of the distance measuring device and camera fusion system is the basis and premise of the data fusion process.
  • the most commonly used calibration method relies on a large checkerboard or another specific geometric calibration object: the 3D geometric features of the calibration object are fitted from the 3D measurement data of the distance measuring device, the corresponding 2D features are extracted from the two-dimensional images collected by the camera, and, based on the geometric constraints of the calibration object, the 3D-2D relationship is established to complete the extrinsic calibration between the sensors (the relative position between the sensor coordinate systems is described by a rotation matrix and a translation vector in 3D space).
  • this method requires the help of special markers or calibration scenes to fit the 3D geometric features of the calibration object.
  • the method has poor adaptability and is not flexible enough.
  • the sparse resolution of the distance measuring device also limits the calibration accuracy; and
  • the intrinsic calibration of the distance measuring device is difficult and has low precision, which also affects the accuracy of the 3D measurement results, thereby affecting the accuracy of the extrinsic calibration between the sensors.
  • Embodiments of the present application provide a calibration method and device for a distance measuring device and a camera fusion system, which can solve the above problems.
  • the embodiment of the present application provides a calibration method for a distance measuring device and a camera fusion system, including:
  • controlling the distance measuring device to project a spot beam to the calibration object and collecting the spot beam to obtain a first target image, and synchronously controlling the camera to collect the calibration object to obtain a second target image;
  • An extrinsic parameter between the distance measuring device and the camera is calculated according to the target point pair.
  • the acquiring the three-dimensional coordinate information of the spot corresponding to the spot beam according to the first target image includes:
  • the three-dimensional coordinate information of the light spot corresponding to the spot light beam in the world coordinate system is calculated.
  • the determining multiple pairs of target points according to the three-dimensional coordinate information and the two-dimensional coordinate information corresponding to a plurality of the light spots includes:
  • the three-dimensional coordinate information is matched with the two-dimensional coordinate information according to a preset projection rule and the plane feature information of the calibration object to obtain multiple pairs of target points.
  • the calculating the extrinsic parameters between the distance measuring device and the camera according to the target point pair includes:
  • An iterative calculation is performed according to the target point pair and the reprojection error function model to obtain a minimum reprojection error and an extrinsic parameter between the distance measuring device and the camera.
  • the method further includes:
  • the calibration object is a diffuse reflection plane calibration plate or a stepped target object.
  • the embodiment of the present application provides a calibration device for a distance measuring device and a camera fusion system, including:
  • a control unit configured to control the distance measuring device to project a spot beam to the calibration object and collect the spot beam to obtain a first target image, and synchronously control the camera to collect the calibration object to obtain a second target image;
  • a first acquiring unit configured to acquire, according to the first target image, three-dimensional coordinate information of a light spot corresponding to the spot beam in a world coordinate system
  • a second acquiring unit configured to acquire two-dimensional coordinate information of a spot corresponding to the spot beam in a pixel coordinate system according to the second target image
  • a determining unit configured to determine multiple pairs of target points according to the three-dimensional coordinate information and the two-dimensional coordinate information corresponding to the plurality of spots;
  • a calculating unit configured to calculate an extrinsic parameter between the distance measuring device and the camera according to the target point pair.
  • the first acquisition unit is specifically used for:
  • the three-dimensional coordinate information of the light spot corresponding to the spot light beam in the world coordinate system is obtained by calculation.
  • the determining unit is specifically used for:
  • the three-dimensional coordinate information is matched with the two-dimensional coordinate information according to a preset projection rule and the plane feature information of the calibration object to obtain multiple pairs of target points.
  • the calculating unit is specifically used for:
  • An iterative calculation is performed according to the target point pair and the reprojection error function model to obtain a minimum reprojection error and an extrinsic parameter between the distance measuring device and the camera.
  • the calibration device of the distance measuring device and the camera fusion system also includes:
  • a processing unit configured to optimize the initial value of the first intrinsic parameters of the distance measuring device and the initial value of the second intrinsic parameters of the camera in the fusion system, to obtain the optimized first target intrinsic parameters of the distance measuring device and the optimized second target intrinsic parameters of the camera.
  • the calibration object is a diffuse reflection plane calibration plate or a stepped target object.
  • an embodiment of the present application provides a calibration device for a distance measuring device and a camera fusion system, including a memory, a processor, and a computer program stored in the memory and operable on the processor.
  • the processor executes the computer program, the method for calibrating the distance measurement device and the camera fusion system as described in the first aspect is realized.
  • an embodiment of the present application provides a computer-readable storage medium storing a computer program; when the computer program is executed by a processor, the calibration method for a distance measuring device and camera fusion system as described in the first aspect is implemented.
  • the distance measuring device is controlled to project a spot beam to the calibration object and collect the spot beam to obtain the first target image, and the camera is synchronously controlled to collect the calibration object to obtain the second target image; the three-dimensional coordinate information of the spot corresponding to the spot beam in the world coordinate system is obtained according to the first target image; the two-dimensional coordinate information of the spot corresponding to the spot beam in the pixel coordinate system is obtained according to the second target image; multiple pairs of target points are determined according to the three-dimensional and two-dimensional coordinate information corresponding to multiple light spots; and an extrinsic parameter between the distance measuring device and the camera is calculated according to the target point pairs.
  • the above method can realize offline calibration of the distance measuring device and camera fusion system: a camera with infrared sensitivity images the light spots projected by the distance measuring device, that is, under the condition of strict time synchronization, the camera is controlled to image the emitted light spots. This enables accurate 3D-2D matching, thereby realizing high-precision offline calibration, breaking through the limitation that the resolution of the distance measuring device places on calibration accuracy, and enabling large-scale, high-precision extrinsic calibration in industrial production.
  • FIG. 1 is a schematic flowchart of a calibration method for a distance measuring device and a camera fusion system provided in the first embodiment of the present application;
  • Fig. 2 is a schematic diagram of a spatial encoding rule in a calibration method of a distance measuring device and a camera fusion system provided in the first embodiment of the present application;
  • Fig. 3 is a schematic diagram of another spatial encoding rule in a calibration method of a distance measuring device and a camera fusion system provided in the first embodiment of the present application;
  • Fig. 4 is a schematic diagram of a step-shaped target in a calibration method of a distance measuring device and a camera fusion system provided in the first embodiment of the present application;
  • Fig. 5 is a schematic diagram of the calibration equipment of the distance measurement device and the camera fusion system provided by the second embodiment of the present application;
  • Fig. 6 is a schematic diagram of a calibration device of a distance measuring device and a camera fusion system provided by a third embodiment of the present application.
  • the term “if” may be construed, depending on the context, as “when”, “once”, “in response to determining” or “in response to detecting”.
  • the phrase “if determined” or “if [the described condition or event] is detected” may be construed, depending on the context, to mean “once determined”, “in response to the determination”, “once [the described condition or event] is detected” or “in response to detection of [the described condition or event]”.
  • references to “one embodiment” or “some embodiments” or the like in the specification of the present application mean that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application.
  • appearances of the phrases “in one embodiment,” “in some embodiments,” “in other embodiments,” etc. in various places in this specification do not necessarily all refer to the same embodiment, but mean “one or more but not all embodiments” unless specifically stated otherwise.
  • the terms “including”, “comprising”, “having” and variations thereof mean “including but not limited to”, unless specifically stated otherwise.
  • FIG. 1 is a schematic flowchart of a calibration method for a distance measuring device and a camera fusion system provided in the first embodiment of the present application.
  • a calibration method for a distance measuring device and a camera fusion system is executed by a device having a calibration function for a distance measuring device and a camera fusion system.
  • before describing the calibration method of the distance measuring device and camera fusion system in detail, the distance measuring device and the camera fusion system will be described first.
  • the distance measuring device is a LiDAR or a depth camera using a fixed-array emission mode; it can be an area-array emission type or a mechanical scanning LiDAR, or a depth camera based on the time-of-flight principle (including dToF, iToF, etc.).
  • the transmitter of the distance measuring device comprises at least one light source for projecting a spot beam.
  • the distance measuring device includes a transmitter, a collector, a control and processing circuit.
  • Emitters include light sources, emitting optics, and more.
  • a beam splitting element is also included.
  • the light source can be a single light source or a light source array composed of multiple light sources.
  • the light source array can be configured to emit light in groups and divided into multiple sub-light source arrays. Each sub-light source array includes a row or a column of light sources, or any other form.
  • when the emitter is controlled to emit spot beams, only one sub-light source array, or only one light source in each sub-light source array, can be turned on at a time, so as to generate a fixed point array projected on the target surface.
  • the light source is configured as a VCSEL (Vertical-Cavity Surface-Emitting Laser) array light source, which emits in arrays through column addressing or binary addressing; the beams pass through an emitting optical element consisting of one or more lenses and are modulated and projected onto the target surface in the form of a fixed point array.
  • the light source can use an EEL (Edge-Emitting Laser) or a VCSEL to emit spot beams; the emitting optical elements include a collimating lens and a beam splitting element, and the beams are optically collimated after passing through the collimating lens.
  • the beam splitting element splits the beam, and also produces a fixed point array that is projected on the surface of the object.
  • the beam splitting element can be a Diffractive Optical Element (DOE), a microlens array, etc.
  • the collector includes a pixel unit composed of at least one pixel, a filter unit and a receiving optical element; the receiving optical element images the spot beam reflected by the target onto the pixel array, the filter unit is used to filter out background light and stray light, and each pixel can be one of photodetectors such as APD, SiPM, SPAD, CCD or CMOS.
  • the pixel unit is an image sensor specially used for light time-of-flight measurement, and the pixel unit can also be integrated into a photosensitive chip specially used for light time-of-flight measurement.
  • the pixel unit is composed of a plurality of SPADs that can respond to an incident single photon and output a photon signal indicating the corresponding arrival time of the received photon at each SPAD.
  • the collector also includes a readout circuit composed of one or more of a signal amplifier connected to the pixel unit, a time-to-digital converter (TDC), an analog-to-digital converter (ADC) and the like (these circuits can be integrated with the pixels as part of the collector, or can form part of the control and processing circuitry).
  • the control and processing circuit can be an independent dedicated circuit, such as the depth camera's own independent circuit with computing power; it can also be a general processing circuit, for example, when the depth camera is integrated into a smart terminal such as a mobile phone, TV or computer, the processor in the terminal can perform the functions of the control and processing circuit.
  • the control and processing circuit simultaneously controls the emitter and the collector, and calculates the depth of the target based on the time difference or phase difference between the emitted beam and the reflected beam.
  • the control and processing circuit is regarded as a part of equipment having the calibration function of the distance measuring device and the camera fusion system.
  • the indirect time-of-flight (iToF) method can also be used: the time of flight is recovered from the phase information of the transmitted waveform, or a modulated and coded continuous-wave signal is transmitted and the receiver indirectly recovers the time of flight through signal processing methods such as correlation matching, for example AMCW (amplitude-modulated continuous wave), FMCW (frequency-modulated continuous wave), coded pulse transmission, etc.
  • the above-mentioned different ranging schemes will not affect the realization of this scheme.
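  • for orientation only (not part of the claimed scheme), the two depth recoveries mentioned above reduce to the standard formulas d = c·Δt/2 for direct ToF and d = c·φ/(4π·f_mod) for phase-based iToF; a minimal sketch follows, where the function names and inputs are illustrative assumptions:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def dtof_depth(round_trip_time_s: float) -> float:
    """Direct ToF: depth is half the round-trip distance, d = c * t / 2."""
    return C * round_trip_time_s / 2.0

def itof_depth(phase_rad: float, mod_freq_hz: float) -> float:
    """Indirect (phase) ToF: d = c * phi / (4 * pi * f_mod), valid within
    one ambiguity range of c / (2 * f_mod)."""
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)
```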
  • the distance measuring device is a mechanical scanning LiDAR.
  • the distance measuring device further includes a mechanical scanning unit, and the mechanical scanning element may be a vibrating mirror, a mirror, MEMS, a wedge mirror, a rotating motor, and the like.
  • in one embodiment, the light source is configured as a point light source or a column light source, the pixel unit is configured as a single pixel or a pixel column, and the scanning unit includes a rotating motor or the like that rotates the whole transceiver system about its axis for 360-degree scanning; the light source still emits in a fixed point array form, and the surrounding environment is scanned and imaged as the transceiver system rotates.
  • in another embodiment, the light source is configured as a point light source or a column light source, likewise emitting a fixed point array; the dot-matrix beam emitted by the light source is projected onto the surface of the object through the mechanical scanning element, and an image of the object is scanned as the mechanical scanning element rotates.
  • the scanning unit includes a MEMS mirror, a vibrating mirror, etc., which are used to receive the spot beam emitted by the light source, deflect it and project it onto the surface of the object to form a fixed point array.
  • the camera in the fusion system described in this application is a high-resolution imaging camera.
  • the high-resolution imaging camera must maintain strict time synchronization with the distance measuring device so that the point-by-point or array spots are imaged clearly. In actual product use, visible light or near-infrared light can be perceived according to the usage scene, realizing the fusion perception effect of the camera and the distance measuring device.
  • the calibration method of the distance measuring device and camera fusion system as shown in Figure 1 may include:
  • S101 Control the distance measuring device to project a spot beam to a calibration object and collect the spot beam to obtain a first target image, and synchronously control the camera to collect the calibration object to obtain a second target image.
  • the device controls the emitter in the distance measuring device to project a spot beam to the calibration object and collects the spot beam through the collector to obtain the first target image.
  • a calibration object is set, and the device controls the emitter in the distance measuring device to project a spot beam to the calibration object.
  • the calibration object is a diffuse reflection plane calibration plate or a stepped target object.
  • the light source can be a single light source or a light source array composed of multiple light sources, and the light source array can be configured to emit light in groups for projecting spot beams.
  • the diffuse reflection plane calibration plate can be a plane with good flatness, for example, a white wall.
  • the emitter emits laser light to project the spot beam toward the diffuse reflection plane calibration plate; the projection can follow preset projection rules.
  • the preset projection rules improve the accuracy of spot position matching and avoid mismatches caused by adjacent spots.
  • the preset projection rules may include spatial encoding rules and temporal encoding rules.
  • the time coding rule means that, during the calibration process, the transmitter is controlled to project spot beams onto the surface of the diffuse reflection plane calibration plate according to a time-coded sequence, which facilitates accurate matching of the spot pairs later.
  • the time coding rule refers to the order in which the light source is turned on during the calibration process.
  • the emitter is controlled to project only one spot beam at a time, and the emission sequence can be set in a certain order or randomly.
  • the spatial encoding rule means that during the calibration process, the light source array can be configured to include multiple sub-light source arrays.
  • the sub-light source array can be a row or a column of light sources, or take any other form; only one sub-light source array, or only one light source in each sub-light source array, is turned on at a time.
  • the light source is controlled to emit one-dimensionally, row by row or column by column; as shown in Figure 2, the light spots are emitted in sequence from left to right in columns, and only one column of light spots is projected and imaged by the camera at any one time.
  • coded transmission can also be realized by spatial division, as shown in the upper left corner of Fig. 3: a typical sub-block is code-scanned and transmitted along the direction indicated by the arrow, the transmission of the other sub-blocks is consistent with this sub-block, and at any given time each sub-block is guaranteed to project and image only one light spot.
  • the size design of the block and the arrow indication are only for understanding, not a fixed requirement of the scheme, and certain adjustments can be made in related schemes.
  • when the spatial encoding rule is used, the robustness of spot matching can be improved, the spatial position between different spots can be effectively controlled, interference can be avoided, the probability of false matching can be reduced or eliminated, and the calibration accuracy can be improved; a minimal sketch of the two encoding schedules is given below.
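  • the following non-authoritative sketch generates on/off masks for an H×W light source array under the two encoding patterns above (column-sequential emission as in Figure 2, sub-block scanning as in Figure 3); the array size, block size and helper names are assumptions for illustration:

```python
import numpy as np

def column_sequential_schedule(rows: int, cols: int):
    """Figure-2-style encoding: enable a single column of the light source
    array per time step, left to right."""
    for c in range(cols):
        mask = np.zeros((rows, cols), dtype=bool)
        mask[:, c] = True
        yield mask

def subblock_schedule(rows: int, cols: int, block: int):
    """Figure-3-style encoding: tile the array into block x block sub-blocks;
    at each step every sub-block lights exactly one source at the same
    within-block offset, so simultaneously lit spots stay far apart."""
    for dr in range(block):
        for dc in range(block):
            mask = np.zeros((rows, cols), dtype=bool)
            mask[dr::block, dc::block] = True
            yield mask
```

  • each yielded mask marks which light sources are turned on at that time step; iterating a schedule reproduces the emission order described above.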
  • instead of the diffuse reflection plane calibration plate, a special calibration object can also be selected to better extract feature points for pairing.
  • a typical calibration object is the stepped target shown in Figure 4. It is understandable that, for a mechanical scanning LiDAR, since the LiDAR emits beams at a fast scanning rate and the exposure time of the camera is too long to image point by point, the actual imaging result is a continuous spot scanning trajectory; joint calibration therefore requires specially designed calibration objects, such as a stepped target, to better extract 3D feature points.
  • while controlling the distance measuring device to project a spot beam to the calibration object and collecting the spot beam to obtain the first target image, the device synchronously controls the camera in the fusion system to collect the calibration object to obtain the second target image. To ensure calibration accuracy, the collection of the second target image must be strictly time-synchronized with the collection of the spot beam for the first target image.
  • S102 Acquire three-dimensional coordinate information of a light spot corresponding to the spot light beam in a world coordinate system according to the first target image.
  • the device acquires the three-dimensional coordinate information of the light spot corresponding to the spot light beam in the world coordinate system according to the first target image.
  • the first target image is collected by the collector, and the coordinates of the light spot in the pixel coordinate system can be obtained from it; after the depth value corresponding to each light spot is obtained, the three-dimensional coordinate information of the spot in the world coordinate system can be calculated according to the intrinsic parameters of the distance measuring device.
  • the device obtains the first coordinate of the light spot corresponding to the spot beam in the pixel coordinate system according to the first target image. The i-th light source in the transmitter is projected onto the calibration plate to form the i-th light spot; in a coordinate system whose origin is the upper left corner of the light source array, the coordinates of the i-th light source are (x, y). Due to the optical conjugation of the emitter and the collector in the distance measuring device, the first coordinate of the light spot corresponding to the i-th spot beam in the pixel coordinate system can be taken as (x_i, y_i).
  • the number i of the light spot projected on the calibration object can be known by controlling the order in which the light sources are turned on.
  • the collector can also be used as a camera, and the light spot is imaged on the collector, and the coordinates of the light spot in the pixel coordinate system can also be determined according to the imaging position of the light spot.
  • the device acquires the target depth value of the spot corresponding to the spot beam, and calculates the three-dimensional coordinate information of the spot according to the intrinsic parameters of the distance measuring device, the first coordinate and the target depth value.
  • the intrinsic parameters K_D of the distance measuring device are pre-stored in the device, including the focal length f_d, the distortion parameters and the principal-point (pixel) offset (c_x, c_y), so that the three-dimensional coordinate information of the light spot corresponding to the i-th spot beam in the world coordinate system can be calculated.
  • given the first coordinate (x_i, y_i) of the light spot corresponding to the i-th spot beam in the pixel coordinate system and its target depth value D, the ideal coordinate P′_Wi(X′_Wi, Y′_Wi, Z′_Wi) of the spot in the world coordinate system follows the pinhole back-projection X′_Wi = (x_i − c_x)·D/f_d, Y′_Wi = (y_i − c_y)·D/f_d, Z′_Wi = D; the distortion parameters are then applied to perform the distortion operation and obtain the spatial coordinates P_Wi(X_Wi, Y_Wi, Z_Wi) in the world coordinate system, that is, the three-dimensional coordinate information of the spot corresponding to the spot beam.
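  • a minimal sketch of this back-projection, assuming an ideal pinhole model for the distance measuring device with intrinsic matrix K_D = [[f_d, 0, c_x], [0, f_d, c_y], [0, 0, 1]] and ignoring the distortion step; names are illustrative:

```python
import numpy as np

def spot_to_world(x_i: float, y_i: float, depth: float, K_D: np.ndarray) -> np.ndarray:
    """Back-project the i-th spot's pixel coordinate (x_i, y_i) and target
    depth value D to the ideal 3D point P'_Wi in the device/world frame."""
    f_d = K_D[0, 0]
    c_x, c_y = K_D[0, 2], K_D[1, 2]
    return np.array([(x_i - c_x) * depth / f_d,
                     (y_i - c_y) * depth / f_d,
                     depth])
```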
  • S103 Acquire two-dimensional coordinate information of a light spot corresponding to the spot light beam in a pixel coordinate system according to the second target image.
  • the device obtains the two-dimensional coordinate information of the spot corresponding to the spot beam in the pixel coordinate system according to the second target image, that is, the device can directly determine the two-dimensional coordinate information according to the imaging position of the spot corresponding to the spot beam in the second target image.
  • the device may mark the coordinates of the i-th light spot in the pixel coordinate system as p i (u i , v i ).
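  • one possible way to obtain p_i(u_i, v_i) in practice (a sketch only; the patent does not prescribe a particular detector) is to threshold the second target image and take connected-component centroids, e.g. with OpenCV; the threshold value is an assumption:

```python
import cv2
import numpy as np

def extract_spot_centroids(second_target_image: np.ndarray, thresh: int = 200) -> np.ndarray:
    """Return an array of (u_i, v_i) centroids of bright spots in the camera
    image; assumes a single-channel (grayscale/near-infrared) 8-bit image."""
    _, binary = cv2.threshold(second_target_image, thresh, 255, cv2.THRESH_BINARY)
    n_labels, _, _, centroids = cv2.connectedComponentsWithStats(binary)
    return centroids[1:]  # drop label 0, the background
```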
  • S104 Determine multiple pairs of target points according to the three-dimensional coordinate information and the two-dimensional coordinate information corresponding to the multiple light spots.
  • the device has obtained the 3D coordinate information and 2D coordinate information corresponding to multiple light spots.
  • the device can pair the 3D coordinate information with the 2D coordinate information, and mark the 3D coordinate information and 2D coordinate information corresponding to the same spot as a target point pair.
  • the pairing method is not limited in this embodiment.
  • in one embodiment, when the device acquires the three-dimensional coordinate information and the two-dimensional coordinate information, it may mark them, and then record the three-dimensional and two-dimensional coordinate information bearing the same mark as a target point pair.
  • the device can match the three-dimensional coordinate information with the two-dimensional coordinate information according to the preset projection rule and the plane feature information of the calibration object to obtain multiple pairs of target points.
  • the preset projection rule has been described in detail in S101, and will not be repeated here.
  • the device matches the three-dimensional coordinate information with the two-dimensional coordinate information according to the preset projection rule and the plane feature information of the calibration object.
  • when the preset projection rule is a time coding rule, the device controls the transmitter to project only one spot beam at a time; the device can therefore match the three-dimensional and two-dimensional coordinate information corresponding to the same moment and obtain an initial point pair, completing the pairing of the three-dimensional and two-dimensional coordinate information.
  • the imaging position and arrangement of the light spots in the first target image and the second target image are matched to realize the pairing of the three-dimensional and two-dimensional coordinate information.
  • with the spatial encoding method, it is also necessary to perform parallax correction on the light spots in the first target image, and to match the corrected first target image with the second target image to realize the pairing of the three-dimensional and two-dimensional coordinate information.
  • the time coding rule and the space coding rule can also be combined in any form to modulate the spot beams projected by the emitter onto the target object; however they are combined, a certain light spot pattern is formed on the target object, and the pattern can be matched according to the arrangement and position of the light spots in the first target image and the second target image.
  • when the calibration object is a step-shaped target object, it is necessary to extract points at special positions in the first target image and the second target image to determine the target point pairs; for example, the three-dimensional coordinate information corresponding to the pixel at the same corner position is extracted from one image and matched with the two-dimensional coordinate information from the other to obtain an initial point pair.
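  • under the time coding rule, pairing reduces to matching by synchronized frame index, as in the following sketch (list names are illustrative; world_points[i] is P_Wi recovered from the first target image of frame i, pixel_points[i] is p_i from the second target image of the same frame):

```python
def pair_by_time_code(world_points, pixel_points):
    """One spot is projected per synchronized frame, so the i-th 3D point and
    the i-th 2D centroid belong to the same spot: zip them into target pairs."""
    assert len(world_points) == len(pixel_points), "frames must be synchronized"
    return list(zip(world_points, pixel_points))
```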
  • S105 Calculate an extrinsic parameter between the distance measuring device and the camera according to the target point pair.
  • the device calculates the extrinsic parameters between the distance measuring device and the camera according to all target point pairs, where the three-dimensional and two-dimensional coordinate information in each target point pair follows the same geometric law.
  • the device can construct the corresponding relationship between the three-dimensional coordinate information and the two-dimensional coordinate information, and the corresponding relationship includes the external parameters between the distance measuring device and the camera.
  • the specific corresponding relationship is the standard projection equation s_i·[u_i, v_i, 1]^T = K_c·(R·P_Wi + T), where s_i is a scale factor, K_c is the intrinsic parameter matrix of the camera, and [R, T] are the extrinsic parameters (rotation matrix and translation vector) between the distance measuring device and the camera.
  • the extrinsic parameters between the distance measuring device and the camera can be accurately calculated through iterative calculation, completing the extrinsic calibration.
  • the PnP (Perspective-n-Point) camera pose estimation algorithm uses several feature points with known coordinates, combined with their imaged positions in the camera output image, to solve for the rotation matrix R and the translation vector T, that is, to solve for the extrinsic parameters.
  • when performing the iterative calculation, a cost function can be constructed to determine when to end the iteration and obtain the final result.
  • when the cost function is minimized, the corresponding extrinsic parameters are the optimal extrinsic parameters.
  • the cost function can be a reprojection error function model. The device constructs the initial reprojection error function model as E(R, T) = Σ_i ‖p_i − π(K_c, R·P_Wi + T)‖², where π(·) denotes the projection of a 3D point to pixel coordinates.
  • the device can optimize the initial value of the first intrinsic parameters of the distance measuring device.
  • the initial value of the second intrinsic parameters of the camera in the fusion system can also be optimized synchronously, to obtain the optimized first target intrinsic parameters of the distance measuring device and the optimized second target intrinsic parameters of the camera.
  • when the intrinsic parameters are optimized together with the extrinsic parameters, the reprojection error function model can be constructed as E(K_D, K_c, R, T) = Σ_i ‖p_i − π(K_c, R·P_Wi(K_D) + T)‖², minimized jointly over the intrinsic and extrinsic parameters.
  • the coordinates of the i-th spot in the pixel coordinate system are marked as p_i(u_i, v_i), and the rotation matrix R and translation vector T are the extrinsic parameters.
  • the device may use the calculated minimum reprojection error as a quantitative evaluation standard for calibration accuracy.
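  • the following sketch shows one possible realization of S105 with OpenCV's iterative PnP solver and a mean reprojection error as the quantitative evaluation; the patent describes the iterative minimization abstractly, so the use of cv2.solvePnP and all names here are illustrative assumptions, not the claimed implementation:

```python
import cv2
import numpy as np

def calibrate_extrinsics(object_points, image_points, K_c, dist_coeffs=None):
    """object_points: Nx3 array of P_Wi; image_points: Nx2 array of p_i;
    K_c: 3x3 camera intrinsic matrix. Returns (R, T, mean reprojection error)."""
    obj = np.asarray(object_points, dtype=np.float64)
    img = np.asarray(image_points, dtype=np.float64)
    if dist_coeffs is None:
        dist_coeffs = np.zeros(5)
    ok, rvec, tvec = cv2.solvePnP(obj, img, K_c, dist_coeffs,
                                  flags=cv2.SOLVEPNP_ITERATIVE)
    R, _ = cv2.Rodrigues(rvec)  # rotation vector -> rotation matrix
    proj, _ = cv2.projectPoints(obj, rvec, tvec, K_c, dist_coeffs)
    err = float(np.linalg.norm(proj.reshape(-1, 2) - img, axis=1).mean())
    return R, tvec, err
```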
  • in summary, the distance measuring device is controlled to project a spot beam to the calibration object and collect the spot beam to obtain the first target image, and the camera in the fusion system is synchronously controlled to collect the calibration object to obtain the second target image; the three-dimensional coordinate information of the light spot corresponding to the spot beam in the world coordinate system is acquired according to the first target image; the two-dimensional coordinate information of the light spot corresponding to the spot beam in the pixel coordinate system is acquired according to the second target image; multiple pairs of target points are determined according to the three-dimensional and two-dimensional coordinate information corresponding to multiple light spots; and an extrinsic parameter between the distance measuring device and the camera is calculated according to the target point pairs.
  • the above method can realize offline calibration of the distance measuring device and camera fusion system: a camera with infrared sensitivity images the light spots projected by the distance measuring device, that is, under the condition of strict time synchronization, the camera is controlled to image the emitted light spots. This enables accurate 3D-2D matching, thereby realizing real-time, high-precision calibration and breaking through the limitation of the resolution of the distance measuring device on calibration accuracy.
  • the distance measurement device in this embodiment can choose a distance measurement scheme based on single-point/multi-point scanning, point array transceiver and time-of-flight measurement principle.
  • the distance measuring device can be any of a variety of LiDAR solutions applied to autonomous driving or intelligent robot scenarios, such as array-based transceiver solutions (row-by-row/column-by-column or area-array transmission and reception), solutions based on rotating mirrors or MEMS scanning imaging, or mechanical rotation scanning solutions based on multi-point transceivers.
  • it can also be a depth camera with a fixed dot-matrix transceiver mode, or one that achieves a fixed point array transceiver mode through a diffractive optical element, including depth (distance) measurement solutions based on the iToF or dToF principle used in mobile phones and similar consumer electronic devices.
  • the replacement of the above-mentioned underlying hardware solutions will not affect the overall integration and high-precision calibration solutions, and all similar solutions should be within the scope of protection of this patent.
  • FIG. 5 is a schematic diagram of a calibration device of a distance measuring device and camera fusion system provided by the second embodiment of the present application. Each included unit is used to execute the steps in the embodiment corresponding to FIG. 1.
  • the calibration device 5 of the distance measuring device and camera fusion system includes:
  • a control unit 510 configured to control the distance measuring device to project a spot beam to the calibration object and collect the spot beam to obtain a first target image, and synchronously control the camera to collect the calibration object to obtain a second target image;
  • the first acquiring unit 520 is configured to acquire, according to the first target image, the three-dimensional coordinate information of the light spot corresponding to the spot light beam in the world coordinate system;
  • the second acquiring unit 530 is configured to acquire, according to the second target image, the two-dimensional coordinate information of the light spot corresponding to the spot light beam in the pixel coordinate system;
  • a determining unit 540 configured to determine multiple pairs of target points according to the three-dimensional coordinate information and the two-dimensional coordinate information corresponding to the multiple light spots;
  • a calculating unit 550 configured to calculate an extrinsic parameter between the distance measuring device and the camera according to the target point pair.
  • the first acquiring unit 520 is specifically configured to:
  • the three-dimensional coordinate information of the light spot corresponding to the spot light beam in the world coordinate system is obtained by calculation.
  • the determining unit 540 is specifically used for:
  • the three-dimensional coordinate information is matched with the two-dimensional coordinate information according to a preset projection rule and the plane feature information of the calibration object to obtain multiple pairs of target points.
  • the calculating unit 550 is specifically used for:
  • An iterative calculation is performed according to the target point pair and the reprojection error function model to obtain a minimum reprojection error and an extrinsic parameter between the distance measuring device and the camera.
  • the calibration device 5 of the distance measuring device and the camera fusion system also includes:
  • a processing unit configured to optimize the initial value of the first intrinsic parameters of the distance measuring device and the initial value of the second intrinsic parameters of the camera, to obtain the optimized first target intrinsic parameters of the distance measuring device and the optimized second target intrinsic parameters of the camera.
  • the calibration object is a diffuse reflection plane calibration plate or a stepped target object.
  • Fig. 6 is a schematic diagram of a calibration device of a distance measuring device and a camera fusion system provided by a third embodiment of the present application.
  • the calibration device 6 of the distance measuring device and camera fusion system of this embodiment includes: a processor 60, a memory 61, and a computer program 62 stored in the memory 61 and operable on the processor 60.
  • when the processor 60 executes the computer program 62, the steps in the above embodiments of the calibration method for a distance measuring device and camera fusion system are implemented, such as steps S101 to S105 shown in FIG. 1.
  • alternatively, when the processor 60 executes the computer program 62, the functions of the modules/units in the above device embodiments, such as the functions of modules 510 to 550 shown in FIG. 5, are realized.
  • the computer program 62 can be divided into one or more modules/units, and the one or more modules/units are stored in the memory 61 and executed by the processor 60 to complete this application.
  • the one or more modules/units may be a series of computer program instruction segments capable of completing specific functions, and the instruction segments are used to describe the execution process of the computer program 62 in the calibration device 6 of the distance measuring device and camera fusion system.
  • the computer program 62 can be divided into a control unit, a first acquisition unit, a second acquisition unit, a determination unit, and a calculation unit, and the specific functions of each unit are as follows:
  • a control unit configured to control the distance measuring device to project a spot beam to the calibration object and collect the spot beam to obtain a first target image, and synchronously control the camera to collect the calibration object to obtain a second target image;
  • a first acquiring unit configured to acquire, according to the first target image, three-dimensional coordinate information of a light spot corresponding to the spot beam in a world coordinate system
  • a second acquiring unit configured to acquire two-dimensional coordinate information of a spot corresponding to the spot beam in a pixel coordinate system according to the second target image
  • a determining unit configured to determine multiple pairs of target points according to the three-dimensional coordinate information and the two-dimensional coordinate information corresponding to the plurality of spots;
  • a calculating unit configured to calculate an extrinsic parameter between the distance measuring device and the camera according to the target point pair.
  • the calibration device of the distance measuring device and camera fusion system may include, but is not limited to, a processor 60 and a memory 61.
  • Fig. 6 is only an example of the calibration device 6 of the distance measuring device and camera fusion system and does not constitute a limitation on it; the device may include more or fewer components than illustrated, combine certain components, or use different components; for example, it may also include input and output devices, network access devices, a bus, etc.
  • the so-called processor 60 can be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc.
  • a general-purpose processor may be a microprocessor, or the processor may be any conventional processor, or the like.
  • the memory 61 may be an internal storage unit of the calibration device 6 of the distance measurement device and camera fusion system, such as a hard disk or a memory of the calibration device 6 of the distance measurement device and camera fusion system.
  • the memory 61 can also be an external storage device of the calibration device 6 of the distance measuring device and camera fusion system, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card or a Flash Card equipped on the calibration device 6.
  • the calibration device 6 of the distance measurement device and camera fusion system may also include both an internal storage unit and an external storage device of the calibration device 6 of the distance measurement device and camera fusion system.
  • the memory 61 is used to store the computer program and other programs and data required by the calibration equipment of the distance measuring device and camera fusion system.
  • the memory 61 can also be used to temporarily store data that has been output or will be output.
  • the embodiment of the present application also provides a network device, which includes: at least one processor, a memory, and a computer program stored in the memory and operable on the at least one processor, and the processor executes The computer program implements the steps in any of the above method embodiments.
  • the embodiment of the present application also provides a computer-readable storage medium, the computer-readable storage medium stores a computer program, and when the computer program is executed by a processor, the steps in each of the foregoing method embodiments can be realized.
  • An embodiment of the present application provides a computer program product.
  • when the computer program product is run on a mobile terminal, the steps in the foregoing method embodiments can be implemented by the mobile terminal.
  • if the integrated unit is realized in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer-readable storage medium. Based on this understanding, all or part of the procedures in the methods of the above embodiments of the present application can be completed by instructing related hardware through computer programs, and the computer programs can be stored in a computer-readable storage medium.
  • the computer program When executed by a processor, the steps in the above-mentioned various method embodiments can be realized.
  • the computer program includes computer program code, and the computer program code may be in the form of source code, object code, executable file or some intermediate form.
  • the computer-readable medium may at least include: any entity or device capable of carrying the computer program code to the photographing device/terminal device, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunication signal and a software distribution medium, for example a U disk, a mobile hard disk, a magnetic disk or an optical disk.
  • in some jurisdictions, under legislation and patent practice, computer-readable media may not include electrical carrier signals and telecommunication signals.
  • the disclosed device/network device and method may be implemented in other ways.
  • the device/network device embodiments described above are only illustrative.
  • the division of the modules or units is only a logical function division.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be through some interfaces, and the indirect coupling or communication connection of devices or units may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Measurement Of Optical Distance (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The present application is applicable to the technical field of distance measurement, and relates to a calibration method and apparatus for a fusion system of a distance measuring device and a camera. The method comprises: controlling the distance measuring device to project a spot beam onto a calibration object and collecting the spot beam to obtain a first target image, and synchronously controlling a camera to photograph the calibration object to obtain a second target image; obtaining, according to the first target image, three-dimensional coordinate information of the light spot corresponding to the spot beam in the world coordinate system; obtaining, according to the second target image, two-dimensional coordinate information of the light spot corresponding to the spot beam in a pixel coordinate system; determining multiple pairs of target points according to the three-dimensional coordinate information and the two-dimensional coordinate information corresponding to multiple light spots; and calculating extrinsic parameters between the distance measuring device and the camera according to the target point pairs. The method can achieve accurate 3D-2D matching, thereby achieving high-precision offline calibration and overcoming the limitation that the resolution of the distance measuring device places on calibration accuracy.
PCT/CN2022/080517 2021-06-18 2022-03-13 Calibration method and apparatus for fusion system of distance measuring device and camera WO2022262332A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110679904.1 2021-06-18
CN202110679904.1A CN113538591B (zh) 2021-06-18 2021-06-18 Calibration method and apparatus for fusion system of distance measuring device and camera

Publications (1)

Publication Number Publication Date
WO2022262332A1 true WO2022262332A1 (fr) 2022-12-22

Family

ID=78125161

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/080517 WO2022262332A1 (fr) 2021-06-18 2022-03-13 Calibration method and apparatus for fusion system of distance measuring device and camera

Country Status (2)

Country Link
CN (1) CN113538591B (fr)
WO (1) WO2022262332A1 (fr)


Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113538591B (zh) * 2021-06-18 2024-03-12 深圳奥锐达科技有限公司 Calibration method and apparatus for fusion system of distance measuring device and camera
CN114301952B (zh) * 2021-11-16 2023-08-01 上海瓶钵信息科技有限公司 Self-calibration system, method, device and medium
CN114235351B (zh) * 2021-12-17 2023-10-31 深圳市先地图像科技有限公司 Method, system and related device for detecting laser spot offset in a laser array
CN114359411B (zh) * 2022-01-10 2022-08-09 杭州巨岩欣成科技有限公司 Swimming-pool anti-drowning target detection method and apparatus, computer device and storage medium
CN114305340B (zh) * 2022-01-12 2023-02-17 广州永士达医疗科技有限责任公司 Resolution detection method and apparatus applied to an OCT host
CN114266836B (zh) * 2022-03-01 2022-05-13 中国科学院自动化研究所 Active-vision three-dimensional calibration method, system and device based on a galvanometer camera
CN115289974B (zh) * 2022-10-09 2023-01-31 思看科技(杭州)股份有限公司 Hole position measurement method and apparatus, computer device and storage medium
CN116704152B (zh) * 2022-12-09 2024-04-19 荣耀终端有限公司 Image processing method and electronic device


Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108828606B (zh) * 2018-03-22 2019-04-30 中国科学院西安光学精密机械研究所 Joint measurement method based on LiDAR and binocular visible-light camera
CN110021046B (zh) * 2019-03-05 2021-11-19 中国科学院计算技术研究所 Extrinsic parameter calibration method and system for combined camera and LiDAR sensor
CN109949372B (zh) * 2019-03-18 2021-12-10 北京智行者科技有限公司 Joint calibration method for LiDAR and vision
CN110148180B (zh) * 2019-04-22 2021-06-08 河海大学 LiDAR and camera fusion device and calibration method
CN110161485B (zh) * 2019-06-13 2021-03-26 同济大学 Extrinsic calibration device for LiDAR and vision camera
CN110390695B (zh) * 2019-06-28 2023-05-23 东南大学 ROS-based fusion calibration system and calibration method for LiDAR and camera
CN111127563A (zh) * 2019-12-18 2020-05-08 北京万集科技股份有限公司 Joint calibration method and apparatus, electronic device and storage medium
CN111366912B (zh) * 2020-03-10 2021-03-16 上海西井信息科技有限公司 Laser sensor and camera calibration method, system, device and storage medium
CN111754583B (zh) * 2020-05-29 2024-05-14 江苏大学 Automated method for joint extrinsic calibration of a vehicle-mounted three-dimensional LiDAR and camera
CN111964694B (zh) * 2020-08-19 2023-11-17 天津大学 Laser rangefinder calibration method for three-dimensional measurement
CN112446927A (zh) * 2020-12-18 2021-03-05 广东电网有限责任公司 Joint calibration method, apparatus, device and storage medium for LiDAR and camera

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130113704A1 (en) * 2011-11-04 2013-05-09 The Regents Of The University Of California Data fusion and mutual calibration for a sensor network and a vision system
CN111754578A (zh) * 2019-03-26 2020-10-09 舜宇光学(浙江)研究院有限公司 Joint calibration method for LiDAR and camera, and system and electronic device thereof
CN112907727A (zh) * 2021-01-25 2021-06-04 中国科学院空天信息创新研究院 Calibration method, apparatus and system for relative transformation matrix
CN112818990A (zh) * 2021-01-29 2021-05-18 中国人民解放军军事科学院国防科技创新研究院 Method for generating target detection frames, and method and system for automatic image data annotation
CN113538591A (zh) * 2021-06-18 2021-10-22 深圳奥锐达科技有限公司 Calibration method and apparatus for fusion system of distance measuring device and camera
CN113538592A (zh) * 2021-06-18 2021-10-22 深圳奥锐达科技有限公司 Calibration method and apparatus for fusion system of distance measuring device and camera
CN216133412U (zh) * 2021-06-18 2022-03-25 深圳奥锐达科技有限公司 Fusion system of distance measuring device and camera

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
CHEN SHOUBIN; LIU JINGBIN; LIANG XINLIAN; ZHANG SHUMING; HYYPPA JUHA; CHEN RUIZHI: "A Novel Calibration Method between a Camera and a 3D LiDAR with Infrared Images", 2020 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 31 May 2020 (2020-05-31), pages 4963 - 4969, XP033825846, DOI: 10.1109/ICRA40945.2020.9196512 *

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116336964A (zh) * 2023-05-31 2023-06-27 天津宜科自动化股份有限公司 System for acquiring object contour information
CN116336964B (zh) * 2023-05-31 2023-09-19 天津宜科自动化股份有限公司 System for acquiring object contour information
CN116485917A (zh) * 2023-06-19 2023-07-25 擎翌(上海)智能科技有限公司 Joint calibration method, system, device and medium for photographing apparatus and radar apparatus
CN116485917B (zh) * 2023-06-19 2023-09-22 擎翌(上海)智能科技有限公司 Joint calibration method, system, device and medium for photographing apparatus and radar apparatus
CN116485918A (zh) * 2023-06-25 2023-07-25 天府兴隆湖实验室 Calibration method and system, and computer-readable storage medium
CN116485918B (zh) * 2023-06-25 2023-09-08 天府兴隆湖实验室 Calibration method and system, and computer-readable storage medium
CN116934871A (zh) * 2023-07-27 2023-10-24 湖南视比特机器人有限公司 Calibration-object-based multi-camera system calibration method, system and storage medium
CN116934871B (zh) * 2023-07-27 2024-03-26 湖南视比特机器人有限公司 Calibration-object-based multi-camera system calibration method, system and storage medium
CN117781854A (zh) * 2023-09-22 2024-03-29 深圳市创客工场科技有限公司 Spatial measurement method, numerically-controlled machine and computer-readable storage medium
CN117315011A (zh) * 2023-11-30 2023-12-29 吉林珩辉光电科技有限公司 Method and apparatus for locating spot center in atmospheric turbulence
CN117315011B (zh) * 2023-11-30 2024-04-02 吉林珩辉光电科技有限公司 Method and apparatus for locating spot center in atmospheric turbulence
CN117788781A (zh) * 2024-02-28 2024-03-29 深圳市易检车服科技有限公司 Calibration object recognition method and apparatus, electronic device and storage medium
CN117788781B (zh) * 2024-02-28 2024-06-07 深圳市易检车服科技有限公司 Calibration object recognition method and apparatus, electronic device and storage medium

Also Published As

Publication number Publication date
CN113538591A (zh) 2021-10-22
CN113538591B (zh) 2024-03-12

Similar Documents

Publication Publication Date Title
WO2022262332A1 (fr) Calibration method and apparatus for fusion system of distance measuring device and camera
CN113538592B (zh) Calibration method and apparatus for fusion system of distance measuring device and camera
CN110596721B (zh) Time-of-flight distance measurement system and method with double-shared TDC circuit
CN110596722B (zh) Time-of-flight distance measurement system and method with adjustable histogram
CN110596725B (zh) Interpolation-based time-of-flight measurement method and system
CN110596724B (zh) Time-of-flight distance measurement method and system with dynamic histogram drawing
CN110596723B (zh) Time-of-flight distance measurement method and system with dynamic histogram drawing
WO2021072802A1 (fr) Distance measurement system and method
WO2023015880A1 (fr) Method for acquiring training sample set, model training method and related apparatus
CN107860337B (zh) Structured-light three-dimensional reconstruction method and apparatus based on an array camera
WO2019076072A1 (fr) Optical distance measurement method and apparatus
CN110603461B (zh) Time-of-flight device
WO2023103198A1 (fr) Method and device for calculating relative extrinsic parameters of a ranging system, and storage medium
EP4047386A1 (fr) Depth detection apparatus and electronic device
CN111965658B (zh) Distance measurement system and method, and computer-readable storage medium
CN211148917U (zh) Distance measurement system
KR20190000052A (ko) Light emitting apparatus and ToF (Time of Flight) module using the same
CN110986816B (zh) Depth measurement system and measurement method thereof
CN110780312A (zh) Adjustable distance measurement system and method
CN216133412U (zh) Fusion system of distance measuring device and camera
CN116485862A (zh) Depth data calibration method and apparatus
CN212471510U (zh) Mobile robot
CN111982071B (zh) 3D scanning method and system based on a ToF camera
CN112750165A (zh) Parameter calibration method, intelligent driving method, and apparatus, device and storage medium thereof
CN117111046B (zh) Distortion correction method, system, device and computer-readable storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22823823

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22823823

Country of ref document: EP

Kind code of ref document: A1