CN113538592B - Calibration method and device for distance measuring device and camera fusion system

Info

Publication number
CN113538592B
CN113538592B (application CN202110680076.3A)
Authority
CN
China
Prior art keywords
camera
spot
target image
distance measuring
target
Prior art date
Legal status
Active
Application number
CN202110680076.3A
Other languages
Chinese (zh)
Other versions
CN113538592A (en)
Inventor
刘浏
陈首彬
陈文胜
姜佑其
刘贤焯
闫敏
Current Assignee
Shenzhen Oradar Technology Co Ltd
Original Assignee
Shenzhen Oradar Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Oradar Technology Co Ltd filed Critical Shenzhen Oradar Technology Co Ltd
Priority to CN202110680076.3A
Publication of CN113538592A
Application granted
Publication of CN113538592B

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85Stereo camera calibration
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/02Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • G01B11/03Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by measuring coordinates of points
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • G06T2207/10012Stereo images
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

The application relates to the technical field of distance measurement and provides an online calibration method for a distance measuring device and camera fusion system, comprising the following steps: controlling the distance measuring device to project spot beams onto a target scene and collect the reflected beams to obtain a first target image, while synchronously controlling the camera to collect the target scene to obtain a second target image; acquiring, from the first target image, three-dimensional coordinate information of the spots corresponding to the spot beams in the world coordinate system; acquiring, from the second target image, two-dimensional coordinate information of the spots in the pixel coordinate system; determining a plurality of initial point pairs according to a preset projection rule; obtaining the matching confidence of each initial point pair; and taking the initial point pairs whose matching confidence satisfies a preset condition as target point pairs, and calculating the external parameters between the distance measuring device and the camera from the target point pairs. Real-time, high-precision online calibration is thereby achieved, and the robustness and adaptability of the self-calibration scheme are improved.

Description

Calibration method and device for distance measuring device and camera fusion system
Technical Field
The application belongs to the technical field of distance measurement, and particularly relates to a calibration method and device of a distance measurement device and camera fusion system.
Background
In current mature intelligent perception schemes, particularly autonomous driving systems at level L4 and above, the requirements on perception capability are diverse, and the vehicle and its environment are perceived accurately, in real time, comprehensively and reliably by fusing multiple sensors such as a distance measuring device and a camera. The distance measuring device, which comprises a depth camera or a LiDAR (Light Detection and Ranging) sensor based on the time-of-flight principle, acquires three-dimensional data of the target. In a fusion system of a distance measuring device and a camera, the camera provides rich visual texture information to compensate for the low resolution of the distance measuring device and its resulting shortcomings in recognition and cognition; meanwhile, the direct 3D geometric measurements of the distance measuring device make up for the camera's deficiency in depth estimation and provide more accurate depth information.
In a fusion system of a distance measuring device and a camera, the primary problem to be solved is how to calibrate the data of the different sensors into the same coordinate system; high-precision calibration of the fusion system is the basis and premise of data fusion processing. Usually, the parameters of the fusion system are calibrated before delivery, but in actual use the internal and external parameters of the system drift under the influence of temperature, mechanical collision, aging and other factors, so the calibration accuracy of the system degrades, the sensors become misaligned, and the accuracy of subsequent applications such as perception and three-dimensional reconstruction is significantly affected; the internal and external parameters then need to be recalibrated online. However, existing online calibration algorithms are poor in accuracy and robustness, so in this situation only offline calibration after returning the device to the factory is possible; existing offline calibration algorithms impose strict requirements on the scene and targets, the actual operation is complex, and the labor and time costs of the whole process are high.
Disclosure of Invention
The embodiment of the application provides a calibration method and device for a distance measuring device and camera fusion system, which can solve the above problems.
In a first aspect, an embodiment of the present application provides a calibration method for a fusion system of a distance measurement device and a camera, including:
controlling the distance measuring device to project a spot light beam to a target scene and acquire the spot light beam to obtain a first target image, and synchronously controlling the camera to acquire the target scene to obtain a second target image; the second target image is an imaging result in an infrared band;
acquiring three-dimensional coordinate information of a spot corresponding to the spot beam under a world coordinate system according to the first target image;
acquiring two-dimensional coordinate information of a spot corresponding to the spot beam under a pixel coordinate system according to the second target image;
determining a plurality of pairs of initial point pairs according to a preset projection rule, wherein the initial point pairs comprise the three-dimensional coordinate information and the two-dimensional coordinate information corresponding to the light spots;
obtaining the matching confidence of each initial point pair;
and taking the initial point pair with the matching confidence degree meeting a preset condition as a target point pair, and calculating an external parameter between the distance measuring device and the camera according to the target point pair.
Further, the obtaining, according to the first target image, three-dimensional coordinate information of the spot corresponding to the spot beam in the world coordinate system includes:
acquiring a first coordinate of a spot corresponding to the spot light beam under a pixel coordinate system according to the first target image;
obtaining a distance measurement value corresponding to the spot light beam;
and calculating three-dimensional coordinate information of the light spot corresponding to the spot light beam under a world coordinate system according to the internal reference of the distance measuring device, the first coordinate and the distance measurement value.
Further, before the determining the pairs of initial points according to the preset projection rule, the method further comprises:
calculating the parallax value of the light spot corresponding to the spot light beam according to the distance measurement value;
correcting the spatial distribution of the light spots in the second target image according to the parallax value;
and determining a plurality of pairs of initial points by matching the corrected second target image and the first target image.
Further, the determining a plurality of initial point pairs according to the preset projection rule includes:
the preset projection rule is a time coding rule, and the three-dimensional coordinate information and the two-dimensional coordinate information corresponding to the same moment are matched to obtain a plurality of pairs of initial point pairs;
or, the preset projection rule is a space coding rule, and the spatial distribution of the light spots in the first target image and the second target image is matched to obtain a plurality of initial point pairs.
Further, the calculating the external parameter between the distance measuring device and the camera according to the target point pair includes:
constructing a reprojection error function model;
and carrying out iterative computation according to the target point pair and the re-projection error function model to obtain the minimum re-projection error and the external parameters between the distance measuring device and the camera.
Further, after performing iterative computation according to the target point pair and the re-projection error function model to obtain a minimum re-projection error and an external parameter between the distance measurement device and the camera, the method further includes:
and optimizing a first internal parameter initial value of a depth camera of the distance measuring device and a second internal parameter initial value of a camera in the camera fusion system to obtain an optimized first target internal parameter of the distance measuring device and an optimized second target internal parameter of the camera.
Further, after the obtaining of the matching confidence of each initial point pair, the method further comprises:
Determining overall confidence according to the matching confidence of each initial point pair;
and if the overall confidence coefficient is smaller than a preset threshold value, judging that the working state is abnormal, and stopping calculating the external parameters between the distance measuring device and the camera.
In a second aspect, an embodiment of the present application provides a calibration device for a fusion system of a distance measurement device and a camera, including:
the control unit is used for controlling the distance measuring device to project a spot light beam to a target scene and collect the spot light beam to obtain a first target image, and synchronously controlling the camera to collect the target scene to obtain a second target image; the second target image is an imaging result in an infrared band;
the first acquisition unit is used for acquiring three-dimensional coordinate information of the light spot corresponding to the spot light beam under a world coordinate system according to the first target image;
the second acquisition unit is used for acquiring two-dimensional coordinate information of the light spot corresponding to the spot light beam under a pixel coordinate system according to the second target image;
the determining unit is used for determining a plurality of pairs of initial point pairs according to a preset projection rule, wherein the initial point pairs comprise the three-dimensional coordinate information and the two-dimensional coordinate information corresponding to the light spots;
A third obtaining unit, configured to obtain a matching confidence of each initial point pair;
the first processing unit is used for taking an initial point pair with the matching confidence degree meeting a preset condition as a target point pair and calculating an external parameter between the distance measuring device and the camera according to the target point pair.
In a third aspect, an embodiment of the present application provides a calibration device for a distance measurement device and camera fusion system, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the calibration method of the distance measurement device and camera fusion system according to the first aspect when executing the computer program.
In a fourth aspect, an embodiment of the present application provides a computer readable storage medium, where a computer program is stored, where the computer program when executed by a processor implements a calibration method of a distance measurement device and camera fusion system according to the first aspect.
In the embodiment of the application, the distance measuring device is controlled to project spot beams to a target scene and collect them to obtain a first target image, while the camera is synchronously controlled to collect the target scene to obtain a second target image; three-dimensional coordinate information of the spots corresponding to the spot beams in the world coordinate system is acquired from the first target image; two-dimensional coordinate information of the spots in the pixel coordinate system is acquired from the second target image; a plurality of initial point pairs, each comprising the three-dimensional and two-dimensional coordinate information of a spot, are determined according to a preset projection rule; the matching confidence of each initial point pair is obtained; and the initial point pairs whose matching confidence satisfies a preset condition are taken as target point pairs, from which the external parameters between the distance measuring device and the camera are calculated. This self-calibration scheme needs no special calibration equipment or calibration objects: in complex use scenes, the distance measuring device is modulated to emit spot beams according to the preset coding rule, and the spot beams projected onto the target scene are synchronously imaged by a camera with infrared sensing capability. Real-time online calibration of the distance measuring device and the camera is thus realized, the limitation that the resolution of the distance measuring device places on parameter calibration accuracy is overcome, the labor and time costs of calibration are reduced, accurate matching of 3D and 2D data is guaranteed, and the robustness and adaptability of the self-calibration scheme are improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic flow chart of a calibration method of a distance measuring device and camera fusion system according to a first embodiment of the present application;
FIG. 2 is a schematic diagram of a spatial coding rule in a calibration method of a distance measurement device and camera fusion system according to a first embodiment of the present application;
FIG. 3 is a schematic diagram of another spatial coding rule in a calibration method of a distance measurement device and camera fusion system according to a first embodiment of the present application;
FIG. 4 is a schematic diagram of a calibration device of a fusion system of a distance measuring device and a camera according to a second embodiment of the present application;
fig. 5 is a schematic diagram of a calibration device of a fusion system of a distance measuring device and a camera according to a third embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth such as the particular system architecture, techniques, etc., in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in the present specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As used in the present description and the appended claims, the term "if" may be interpreted, depending on the context, as "when", "once", "in response to determining" or "in response to detecting". Similarly, the phrase "if it is determined" or "if [the described condition or event] is detected" may be interpreted, depending on the context, to mean "upon determining", "in response to determining", "upon detecting [the described condition or event]" or "in response to detecting [the described condition or event]".
Furthermore, the terms "first," "second," "third," and the like in the description of the present specification and in the appended claims, are used for distinguishing between descriptions and not necessarily for indicating or implying a relative importance.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
Referring to fig. 1, fig. 1 is a schematic flowchart of a calibration method of a distance measuring device and camera fusion system according to a first embodiment of the present application. In this embodiment, an execution body of a calibration method of a distance measurement device and a camera fusion system is a device having a calibration function of the distance measurement device and the camera fusion system.
Before describing the calibration method of the distance measuring device and the camera fusion system in detail, the distance measuring device and the camera fusion system will be described.
The distance measuring device comprises a ToF depth sensor or LiDAR that projects a spot beam toward the target scene, collects the spot beam reflected by the target scene, and calculates the time of flight of the spot beam from emission to collection to obtain a depth image of the target scene.
The distance measuring device adopts a LiDAR or depth camera with a fixed-array emission mode; it may be an area-array-emission LiDAR, a mechanical-scanning LiDAR, or any depth camera based on the time-of-flight principle (including dToF and iToF). Its emitter comprises at least one light source for projecting spot beams.
In particular, the distance measuring device comprises a transmitter, a collector, and a control and processing circuit.
The emitter comprises a light source, an emitting optical element, etc.; in some embodiments it also includes a beam splitting element. The light source may be a single light source or a light source array composed of a plurality of light sources, where the array may be configured to emit light in groups and divided into a plurality of sub-light-source arrays; each sub-array may be a row or a column of light sources, or take any other form. When the emitter is controlled to emit spot beams, only one sub-light-source array, or only one light source in each sub-light-source array, may be turned on at a time to project a fixed spot array onto the target surface.
A typical example is a light source configured as a VCSEL (Vertical-Cavity Surface-Emitting Laser) array, which performs array emission by column addressing or two-dimensional addressing and, after modulation by an emission optical element composed of a single lens or multiple lenses, is projected onto the target surface as a fixed spot array. As another typical example, the light source may emit a spot beam using an EEL (Edge-Emitting Laser) or a VCSEL; the emission optical element then includes a collimating lens and a beam splitting element, so that after passing through the emission optical element the beam is collimated and then split, the beam splitting element likewise generating a fixed spot array projected onto the object surface. The beam splitting element may be a diffractive optical element (DOE), a microlens array, or the like.
The collector comprises a pixel unit consisting of at least one pixel, a filtering unit, and a receiving optical element; the receiving optical element images the spot beam reflected by the target onto the pixel array, the filtering unit filters out background light and stray light, and each pixel may be one of an APD, SiPM, SPAD, CCD, CMOS or other photodetector. In some embodiments, the pixel unit is an image sensor dedicated to optical time-of-flight measurement, and the pixel unit may also be integrated into a photosensitive chip dedicated to optical time-of-flight measurement. In an exemplary embodiment, the pixel unit includes a plurality of SPADs, which respond to incident single photons and output photon signals indicating the arrival time of the received photon at each SPAD. Typically, the collector further includes a readout circuit connected to the pixel unit and comprising one or more of a signal amplifier, a time-to-digital converter (TDC), an analog-to-digital converter (ADC), etc.; these circuits may be integrated with the pixels as part of the collector or may be part of the control and processing circuit.
The control and processing circuit may be a separate dedicated circuit, such as the depth camera's own circuit with computing capability; it may also include a general-purpose processing circuit, for example when the depth camera is integrated into a smart terminal such as a mobile phone, television or computer, where a processor in the terminal can perform the functions of the control and processing circuit. The control and processing circuit simultaneously controls the emitter and the collector and calculates the depth of the target based on the time difference or phase difference between the emitted and reflected beams. In the present invention, for ease of description, the control and processing circuit is regarded as part of the device having the calibration function of the distance measuring device and camera fusion system.
Regarding the measurement principle by which the control and processing circuit performs the depth calculation: with the direct time-of-flight (dToF) method, the time of flight t is calculated as the difference between the emission moment and the reception moment of the pulse, and the target distance is then calculated according to the formula d = c·t/2. Alternatively, the time of flight may be solved by the indirect time-of-flight (iToF) method, by recovering the phase information of the transmitted waveform; or a modulated and encoded continuous-wave signal may be transmitted, with the receiving end recovering the time of flight indirectly through signal processing such as correlation matching. The choice of ranging scheme, such as AMCW (amplitude-modulated continuous wave), FMCW (frequency-modulated continuous wave) or coded pulse transmission, does not affect the implementation of the present scheme.
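As a minimal illustration of the dToF relation d = c·t/2 above, the following Python sketch computes a target distance from a round-trip pulse time; the function name and the picosecond timestamp unit are assumptions for illustration, not part of the patent.

```python
# Minimal sketch of the direct time-of-flight (dToF) distance calculation
# d = c * t / 2; the timestamp unit (picoseconds) is an illustrative assumption.
C = 299_792_458.0  # speed of light in m/s

def dtof_distance(emit_time_ps: float, receive_time_ps: float) -> float:
    """Return the target distance in meters from a round-trip pulse time."""
    t = (receive_time_ps - emit_time_ps) * 1e-12  # picoseconds -> seconds
    return C * t / 2.0

# A pulse returning after ~66.7 ns corresponds to a target about 10 m away.
print(dtof_distance(0.0, 66_713.0))  # ~10.0
```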
The camera in the fusion system is a high-resolution imaging camera; it must maintain strict time synchronization with the distance measuring device, and during calibration it must clearly image the point-by-point or array spots projected by the laser in the near-infrared band. In actual products, visual perception in visible light or near-infrared light can be selected according to the actual use scene to realize the fused perception effect of the camera and the distance measuring device. The fusion system also comprises a processor, configured to receive the depth image and the visible light image to perform three-dimensional measurement, or to receive the depth image and the visible light image to perform parameter calibration of the system; preferably, the processor is integrated into the device having the calibration function.
In an exemplary embodiment, the camera includes an RGB-IR sensor, that is, different filters are used on different pixel surfaces, so that the camera can image a scene in both visible light and near infrared bands, and when performing online calibration, the camera can image a spot beam emitted by the light source according to the imaging characteristic of the near infrared band to obtain a second target image for subsequent parameter calibration.
In another exemplary embodiment, the camera includes a multi-filter switcher that can switch between different cut-off filters according to the actual use situation and functional requirements, thereby ensuring the imaging reliability of the camera and guaranteeing the imaging of the spot beams during online calibration. Preferably, the multi-filter switcher includes a visible-light band-pass filter, an infrared band-pass filter, and a visible-plus-near-infrared filter. When the fusion system is in the three-dimensional measurement working mode in daytime or in a strong-light environment, the visible-light band-pass filter works and only visible light is admitted; the camera collects visible light images of the target scene, interference of the infrared light emitted by the distance measuring device with the camera imaging is avoided, and the camera can reproduce true visible-light colors. When the online calibration process is started, the infrared cut-off filter is automatically moved away and the infrared band-pass filter starts to work; its gated wavelength range matches the wavelength of the beams projected by the distance measuring device, only infrared light is admitted, interference of visible light with infrared imaging during calibration is avoided, and the camera collects infrared images of the target scene. When the fusion system is in the three-dimensional measurement mode in a night or dim-light environment, a clear visible-light image of the target scene cannot be obtained at such low illuminance, so the visible-light band-pass filter and the infrared band-pass filter are automatically removed and the visible-plus-near-infrared filter starts to work, so that all light is fully utilized when the camera images the target scene and the imaging performance of the camera at low illuminance is improved.
The calibration method of the distance measuring device and the camera fusion system shown in fig. 1 may include:
s101: controlling the distance measuring device to project a spot light beam to a target scene and acquire the spot light beam to obtain a first target image, and synchronously controlling the camera to acquire the target scene to obtain a second target image; the second target image is an imaging result under an infrared band.
The device controls the emitter in the distance measuring device to project spot beams to the target scene and collects the spot beams through the collector to obtain the first target image, while synchronously controlling the camera to collect the target scene to obtain the second target image. It should be noted that, because the fusion system is performing online calibration at this time, the camera needs to acquire an infrared image of the target scene. When the camera is configured with an RGB-IR sensor, a visible light image and an infrared image (the second target image) of the target scene can be output simultaneously; when the camera is configured with a multi-filter switcher, the switcher can be controlled to switch to the infrared band-pass filter before self-calibration, gating only the specified infrared band and avoiding interference from ambient light. Typical choices are infrared bands such as 850 nm, 905 nm and 1550 nm; the specific band must match the band of the optical signal emitted by the emitter. In this embodiment, no calibration object needs to be set up when the fusion system self-calibrates: the device controls the emitter in the distance measuring device to project spot beams toward any object in the target scene, and controls the distance measuring device and the camera to collect the projected spots and self-calibrate according to the collected images.
For distance measuring devices with fixed-array emission, including LiDAR or depth cameras, the light source may be a single light source or a light source array composed of multiple light sources, and the array may be configured to emit light in groups for projecting spot beams. Projection can be carried out according to a preset projection rule; the preset projection rule improves the accuracy of spot position matching and avoids mismatches caused by adjacent spots. The preset projection rules may include spatial coding rules and temporal coding rules.
The time coding rule refers to the sequence in which the light sources are turned on during the calibration process: the emitter is controlled to emit spot beams according to a time-coded sequence, which facilitates the accurate matching of spot pairs. Preferably, the emitter is controlled to project only one spot beam at a time, and the emission sequence may follow a fixed order or be arranged randomly.
The spatial coding rule means that during calibration the light source array may be configured to include a plurality of sub-light-source arrays, where a sub-array may be a row or a column of light sources or take any other form, and only one sub-array, or only one light source in each sub-array, is turned on at a time. For example, in one embodiment the light source is controlled to emit in a one-dimensional row-by-row or column-by-column manner: as shown in fig. 2, spots are emitted column by column from left to right, guaranteeing that only one column of spots is projected and imaged by the camera at any moment. In another embodiment, coded emission is implemented by spatial partitioning: the top-left corner of fig. 3 is a typical sub-block, coded scanning emission proceeds in the direction indicated by the arrow, and the other sub-blocks emit in the same way, so that each sub-block projects and images only one spot at a time. The block size and arrow direction are illustrative only, not fixed requirements of the scheme, and may be adjusted in related schemes. With the spatial coding rule, multiple target point pairs can be obtained from a single projection; at the same time, the projection rule improves the robustness of spot matching, the spatial positions of different spots can be effectively controlled to avoid interference, and the probability of mismatching is reduced or eliminated, thereby improving the calibration accuracy.
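As a concrete sketch of the column-by-column spatial coding rule of fig. 2, the following example enumerates, per emission slot, which sources of an addressable light source array are turned on; the array dimensions and the (row, column) indexing are assumptions for illustration.

```python
from typing import Iterator, List, Tuple

def column_scan_schedule(n_rows: int, n_cols: int) -> Iterator[List[Tuple[int, int]]]:
    """Yield, per emission slot, the (row, col) indices of the active sources.

    Implements a fig. 2 style rule: exactly one column of light sources is
    lit per slot, sweeping left to right, so the camera images one column of
    spots at a time.
    """
    for col in range(n_cols):
        yield [(row, col) for row in range(n_rows)]

for slot, active in enumerate(column_scan_schedule(4, 6)):
    print(f"slot {slot}: active sources {active}")
```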
To improve the robustness of spot matching, the time coding and space coding rules mentioned in the above scheme can be combined in different forms in specific products and schemes, and all such methods fall within the protection scope of this patent.
While controlling the distance measuring device to project spot beams to the target scene and collecting them to obtain the first target image, the device synchronously controls the camera in the fusion system to collect the target scene to obtain the second target image. To ensure calibration accuracy, strict time synchronization between the acquisition of the second target image and the acquisition of the spot beams for the first target image must be guaranteed.
S102: and acquiring three-dimensional coordinate information of the light spot corresponding to the spot light beam under a world coordinate system according to the first target image.
And the device acquires three-dimensional coordinate information of the light spot corresponding to the spot light beam under the world coordinate system according to the first target image. The first target image is acquired by the collector, coordinates of the light spots in a pixel coordinate system can be obtained from the first target image, and after depth values corresponding to the light spots are obtained, three-dimensional coordinate information of the light spots in a world coordinate system can be calculated according to internal parameters of the distance measuring device.
Specifically, the device acquires the first coordinate of the spot corresponding to the spot beam in the pixel coordinate system from the first target image. The ith light source in the emitter projects onto the calibration plate to form the ith spot; in the coordinate system whose origin is the upper-left corner of the light source array, the coordinates of the ith light source are (x_i, y_i), and because the emitter and the collector in the distance measuring device are optically conjugate, the first coordinate of the spot corresponding to the ith spot beam in the pixel coordinate system can also be taken as (x_i, y_i). The index i of the spot projected onto the target scene is known from the turn-on sequence of the light sources.
In another mode, the collector can also be used as a camera, the light spot can be imaged on the collector, and the coordinates of the light spot under the pixel coordinate system can be determined according to the imaging position of the light spot.
Then, the device acquires the distance measurement value D of the spot corresponding to the spot beam and calculates the three-dimensional coordinate information of the spot in the world coordinate system according to the internal parameters of the distance measuring device, the first coordinate and the distance measurement value. The internal parameters K_D of the distance measuring device, which include the focal length, distortion parameters, pixel offset and other variables, are stored in the device in advance and are used to calculate the three-dimensional coordinate information of the spot corresponding to the ith spot beam in the world coordinate system.
Taking one spot as an example: the first coordinate of the spot corresponding to the ith spot beam in the pixel coordinate system is (x_i, y_i). From this coordinate and the distance measurement value, the ideal coordinates (X'_Wi, Y'_Wi, Z'_Wi) of the spot in the world coordinate system are obtained, and a distortion operation with the distortion parameters θ_d is then applied to obtain the spatial coordinates P_Wi = (X_Wi, Y_Wi, Z_Wi) of the spot in the world coordinate system, i.e. the three-dimensional coordinate information of the spot corresponding to the spot beam. Here f_d is the focal length of the distance measuring device, D is the distance measurement value corresponding to the ith spot, and i is the spot index, i = 1, 2, …, n.
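Since the back-projection formula itself is not reproduced above, the following sketch shows one standard way to recover the ideal world coordinates of a spot from its pixel coordinate and the radial distance measurement D under an undistorted pinhole model; the principal point (cx, cy), the pixel focal length and all numeric values are assumptions for illustration, and the subsequent distortion step with θ_d is omitted.

```python
import numpy as np

def spot_to_world(x_i: float, y_i: float, D: float,
                  f_d: float, cx: float, cy: float) -> np.ndarray:
    """Back-project pixel (x_i, y_i) with radial distance D to (X, Y, Z).

    Assumes an ideal pinhole model: build the viewing ray through the pixel,
    normalize it, and scale it by the measured radial distance. The distortion
    operation with the parameters theta_d described above is not modeled.
    """
    ray = np.array([x_i - cx, y_i - cy, f_d], dtype=float)
    ray /= np.linalg.norm(ray)   # unit ray through the pixel
    return D * ray               # scale by the measured distance

# A spot at the principal point lies on the optical axis at depth D.
print(spot_to_world(320.0, 240.0, 5.0, 600.0, 320.0, 240.0))  # [0. 0. 5.]
```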
S103: and acquiring two-dimensional coordinate information of the light spot corresponding to the spot light beam under a pixel coordinate system according to the second target image.
The device obtains the two-dimensional coordinate information of the spot corresponding to the spot beam according to the second target image; that is, the device can directly determine the two-dimensional coordinate information from the imaging position of the spot in the second target image. The device may denote the coordinate of the ith spot in the pixel coordinate system as p_i = (u_i, v_i).
In the embodiment of the application, because targets in the scene lie at different distances and there is a certain distance between the distance measuring device and the camera, the imaging position of a spot beam reflected from targets at different distances is offset, that is, affected by parallax, even for beams emitted by the same light source. When spot beams are emitted according to a preset coding rule and used for matching, the spot positions collected by the camera are shifted by this parallax. For example, if a column of light sources is turned on in one measurement to project a column of spots onto the target, parallax causes the arrangement of the spots collected by the camera to differ from the arrangement projected on the object, so subsequent matching cannot be performed accurately, and mismatches may occur when the spots collected by the distance measuring device and by the camera are matched to determine target point pairs. Therefore, in one embodiment, the two-dimensional coordinates of the spots collected by the camera in the pixel coordinate system need to be corrected, that is, the spatial distribution of the spots in the second target image is corrected, before the matching of the subsequent steps determines the corresponding target point pairs. This improves matching accuracy, adapts the scheme to the corresponding hardware system, reduces the requirements on the calibration scene and targets, and greatly improves the applicability of the scheme to different scenes.
During correction, the device acquires the distance measurement value D of the spot corresponding to the spot beam from the first target image, and calculates the parallax value of the spot from the system baseline (the norm of the translation vector T in the external parameter calibration result) and the equivalent imaging focal length of the system. The initial coordinate information of the spot in the second target image is then corrected according to the parallax value to obtain the two-dimensional coordinate information of the spot in the pixel coordinate system, where the parallax value is the offset of the spot's initial coordinate along the baseline direction, and the baseline direction is taken as the x direction of the pixel coordinate system.
Specifically, given the system baseline B and the equivalent imaging focal length f, the parallax value d corresponding to a spot at distance D can be calculated as:

d = f · B / D

The initial coordinate information of each spot in the second target image is corrected according to the calculated parallax value d, yielding the corrected spatial distribution of the spots in the second target image.
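A minimal sketch of this parallax correction, assuming as stated above that the baseline lies along the x axis of the pixel coordinate system; the sign of the shift depends on the mounting geometry, and all numeric values are illustrative assumptions.

```python
import numpy as np

def correct_spot_parallax(spots_uv: np.ndarray, distances: np.ndarray,
                          baseline: float, focal: float) -> np.ndarray:
    """Shift each spot's x coordinate by its disparity d = f * B / D.

    spots_uv:  (N, 2) initial spot coordinates in the second target image.
    distances: (N,) distance measurements D taken from the first target image.
    """
    d = focal * baseline / distances   # per-spot disparity in pixels
    corrected = spots_uv.astype(float).copy()
    corrected[:, 0] -= d               # remove the offset along the baseline
    return corrected

spots = np.array([[310.0, 240.0], [355.0, 238.0]])
print(correct_spot_parallax(spots, np.array([2.0, 4.0]), 0.05, 600.0))
```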
S104: and determining a plurality of pairs of initial point pairs according to a preset projection rule, wherein the initial point pairs comprise the three-dimensional coordinate information and the two-dimensional coordinate information corresponding to the light spots.
The device can match the three-dimensional coordinate information with the two-dimensional coordinate information according to the preset projection rule to obtain a plurality of initial point pairs. The details of the preset projection rule were described in S101 and are not repeated here, and this embodiment does not limit the pairing manner. For example, when the preset projection rule is a time coding rule, in one implementation the device controls the emitter to project only one spot beam at a time; the device can then match the three-dimensional coordinate information and the two-dimensional coordinate information corresponding to the same moment to obtain an initial point pair, completing the pairing of the three-dimensional and two-dimensional coordinate information.
In another implementation, when the preset projection rule is a space coding rule, the imaging positions and arrangements of the spots in the first target image and the second target image are combined for matching, realizing the pairing of the three-dimensional and two-dimensional coordinate information. When the space coding mode is selected, parallax correction needs to be performed on the spots in the second target image first, and the corrected second target image is then matched with the first target image to pair the three-dimensional and two-dimensional coordinate information.
It will be appreciated that, in practical applications, the time coding rule and the space coding rule may be combined in any form to modulate the spot beams projected by the emitter onto the target object. However they are combined, a spot pattern of a certain mode is formed on the target object, and matching can be achieved according to the arrangement and positions of the spots in the first target image and the second target image.
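As an illustration of pairing under the time coding rule (one spot per emission slot), the following sketch pairs the 3D point recovered from the first target image with the 2D spot detected in the synchronized second target image at the same slot; the per-slot dictionary layout is an assumption for illustration.

```python
from typing import Dict, List, Tuple

Point3D = Tuple[float, float, float]
Point2D = Tuple[float, float]

def pair_by_time_code(points_3d: Dict[int, Point3D],
                      points_2d: Dict[int, Point2D]) -> List[Tuple[Point3D, Point2D]]:
    """Return the initial point pairs (P_Wi, p_i) for every emission slot
    observed by both the distance measuring device and the camera."""
    pairs = []
    for slot, p3 in points_3d.items():
        p2 = points_2d.get(slot)
        if p2 is not None:  # a spot may be missed (occlusion, low SNR)
            pairs.append((p3, p2))
    return pairs
```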
S105: and obtaining the matching confidence of each initial point pair.
The equipment acquires information such as the matching re-projection error, the light spot imaging contrast, the light spot size and the like of each initial point pair, and calculates the matching confidence coefficient of the initial point pair according to the information such as the matching re-projection error, the light spot imaging contrast, the light spot size and the like.
The matching reprojection error can be calculated according to a reprojection error function model preset in the equipment in a factory.
If the internal parameters also need to be optimized, the reprojection error function model can take the form of a sum of squared reprojection errors over all point pairs, minimized over both the internal and external parameters:

min over (K_D, K_C, R, T) of Σ_i || p_i - π(K_C, [R, T], P_Wi(K_D, x_i, y_i, Tof_i)) ||²

where K_D denotes the internal parameters of the distance measuring device, including the focal length, distortion parameters, pixel offset and other variables, K_C denotes the internal parameters of the camera, the rotation matrix R and the translation matrix T are the external parameters, and Tof_i is the depth measurement.
In some embodiments, if the internal parameters of the distance measuring device are fixed and only the external parameters are calibrated, the preset reprojection error function model may be:

min over (R, T) of Σ_i || p_i - π(K_C, [R, T], P_Wi) ||²

where p_i = (u_i, v_i) is the coordinate of the ith spot in the pixel coordinate system, and the rotation matrix R and the translation matrix T are the external parameters.
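A minimal numeric sketch of evaluating such a reprojection error for fixed internal parameters; lens distortion is ignored and the array shapes are assumptions for illustration.

```python
import numpy as np

def reprojection_error(R: np.ndarray, t: np.ndarray, K_C: np.ndarray,
                       P_w: np.ndarray, p_obs: np.ndarray) -> float:
    """Mean squared reprojection error of world points against observed spots.

    R: (3, 3) rotation, t: (3,) translation, K_C: (3, 3) camera internals,
    P_w: (N, 3) spot coordinates P_Wi, p_obs: (N, 2) pixel coordinates p_i.
    """
    P_c = P_w @ R.T + t                 # world -> camera coordinates
    uvw = P_c @ K_C.T                   # apply the camera internal parameters
    p_proj = uvw[:, :2] / uvw[:, 2:3]   # perspective division
    return float(np.mean(np.sum((p_proj - p_obs) ** 2, axis=1)))
```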
Further, after calculating the matching confidence of each initial point pair, the device may determine an overall confidence from the matching confidences of all initial point pairs. If the overall confidence is smaller than a preset threshold, this indicates that a sensor is currently working abnormally or that the external parameters between the sensors deviate significantly; the risk of online calibration is then high and the subsequent self-calibration steps cannot be carried out, so the working state is judged abnormal and the calculation of the external parameters between the distance measuring device and the camera is stopped. An abnormality signal may simultaneously be sent to the higher-level control system, suggesting manual intervention.
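A small sketch of this overall-confidence gate; aggregating the per-pair confidences by their mean and the threshold value are both assumptions for illustration.

```python
from typing import Sequence

def should_calibrate(pair_confidences: Sequence[float],
                     threshold: float = 0.6) -> bool:
    """Return True only if the overall confidence permits online calibration."""
    if not pair_confidences:
        return False
    overall = sum(pair_confidences) / len(pair_confidences)  # assumed: mean
    if overall < threshold:
        # Abnormal working state: stop the external parameter calculation
        # (and, per the text above, report upward for manual intervention).
        return False
    return True
```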
S106: and taking the initial point pair with the matching confidence degree meeting a preset condition as a target point pair, and calculating an external parameter between the distance measuring device and the camera according to the target point pair.
The device screens the initial point pairs according to the matching confidence, and marks the initial point pairs meeting the preset conditions as target point pairs.
The device calculates the external parameters between the distance measuring device and the camera according to all the target point pairs, wherein the three-dimensional coordinate information and the two-dimensional coordinate information in each group of target point pairs follow the same geometric rule. The device may construct a correspondence between the three-dimensional coordinate information and the two-dimensional coordinate information, the correspondence including an external parameter between the distance measuring apparatus and the camera. The specific correspondence is as follows:
[u_i, v_i, 1]^T = K_C [R, t] [X_Wi, Y_Wi, Z_Wi, 1]^T

where [R, t] is the external parameter between the distance measuring device and the camera.
Once a plurality of target point pairs conforming to this correspondence have been obtained, the external parameters between the distance measuring device and the camera can be accurately computed through iterative calculation, completing the calibration of the external parameters.
A PnP (Perspective-n-Point) algorithm can be adopted for the calculation: the PnP camera pose estimation algorithm solves for the rotation matrix R and the translation matrix T, i.e. the external parameters, from feature points with known coordinates and their images in the camera.
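As a usage sketch of PnP-based external parameter estimation, the following example builds synthetic target point pairs from a known pose and recovers it with OpenCV's solvePnP; the internal parameter matrix and all point coordinates are assumptions for illustration, not data from the patent.

```python
import cv2
import numpy as np

K_C = np.array([[600.0, 0.0, 320.0],    # assumed camera internal parameters
                [0.0, 600.0, 240.0],
                [0.0, 0.0, 1.0]])
dist = np.zeros(5)                      # distortion ignored in this sketch

# Synthetic target point pairs (P_Wi, p_i) generated from a known pose.
P_w = np.array([[0.0, 0.0, 2.0], [0.5, 0.0, 2.2], [0.0, 0.5, 2.4],
                [0.5, 0.5, 2.1], [0.25, 0.25, 2.3], [-0.3, 0.1, 2.5]])
rvec_true = np.array([0.01, -0.02, 0.005])
tvec_true = np.array([0.05, 0.0, 0.0])  # e.g. a 5 cm baseline along x
p_i, _ = cv2.projectPoints(P_w, rvec_true, tvec_true, K_C, dist)

ok, rvec, tvec = cv2.solvePnP(P_w, p_i, K_C, dist)
R, _ = cv2.Rodrigues(rvec)              # rotation matrix R of the externals
print(ok, tvec.ravel())                 # recovers approximately [0.05, 0, 0]
```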
The matching reprojection errors are then calculated: iterative computation is carried out on the target point pairs with the reprojection error function model; each iteration yields a new set of external parameters, with which the reprojection error is computed; the minimum value among all reprojection errors is selected as the minimum reprojection error, and the external parameters corresponding to it are taken as the final, optimal external parameters between the distance measuring device and the camera.
Furthermore, the device may use the calculated minimum re-projection error as a quantitative evaluation criterion for the calibration accuracy.
Further, as described in step S105, the matching reprojection error model may be set as a function model containing only the external parameters, or as a function model containing both the internal and external parameters. When it is set as a function model containing both, the device can, through multiple iterative computations, optimize the first internal parameter initial value of the distance measuring device and the second internal parameter initial value of the camera while calculating the external parameters, obtaining the optimized first target internal parameters of the distance measuring device and the optimized second target internal parameters of the camera.
In the embodiment of the application, the distance measuring device is controlled to project spot beams to a target scene and collect them to obtain a first target image, while the camera in the fusion system is synchronously controlled to collect the target scene to obtain a second target image; three-dimensional coordinate information of the spots corresponding to the spot beams in the world coordinate system is acquired from the first target image; two-dimensional coordinate information of the spots in the pixel coordinate system is acquired from the second target image; a plurality of initial point pairs, each comprising the three-dimensional and two-dimensional coordinate information of a spot, are determined according to a preset projection rule; the matching confidence of each initial point pair is obtained; and the initial point pairs whose matching confidence satisfies a preset condition are taken as target point pairs, from which the external parameters between the distance measuring device and the camera are calculated. This self-calibration scheme relies on no special calibration equipment or calibration targets: in complex use scenes, the distance measuring device is modulated to emit spot beams according to the preset coding rule, and the spot beams projected onto the target scene are synchronously imaged by the camera with infrared sensing capability. Real-time online calibration of the distance measuring device and the camera is thus realized, the limitation that the resolution of the distance measuring device places on parameter calibration accuracy is overcome, the labor and time costs of calibration are reduced, accurate matching of 3D and 2D data is guaranteed, and the robustness and adaptability of the self-calibration scheme are improved.
The distance measuring device in this embodiment may implement any ranging scheme based on the time-of-flight measurement principle with single-point/multi-point scanning and dot-array transceiving. Specifically, it may be any of the various LiDAR schemes applied to autonomous driving or intelligent robot scenes, or a depth camera based on an array transceiving scheme (row-by-row or area-array transmitting and receiving), a fixed dot-array transceiving mode, or a fixed dot-array transceiving mode realized through a diffractive optical element; this includes depth (distance) measurement schemes based on the iToF or dToF principle applied to indoor reconstruction, human body scanning, face recognition and similar scenes in mobile phones and comparable consumer electronic devices. For the technical scheme of the application, replacing the underlying hardware scheme does not affect the overall fusion and high-precision calibration scheme, and all similar schemes fall within the protection scope of this patent.
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
Referring to fig. 4, fig. 4 is a schematic diagram of a calibration device of a distance measuring device and camera fusion system according to a second embodiment of the present application. The units included are used to perform the steps in the embodiment corresponding to fig. 1; for details, refer to the description of that embodiment. For convenience of explanation, only the portions related to the present embodiment are shown. Referring to fig. 4, the calibration device 4 of the distance measuring device and camera fusion system includes:
a control unit 410, configured to control the distance measuring device to project spot beams to a target scene and collect the spot beams to obtain a first target image, and synchronously control the camera to collect the target scene to obtain a second target image, wherein the second target image is an imaging result in an infrared band;
a first obtaining unit 420, configured to obtain three-dimensional coordinate information of a spot corresponding to the spot beam in a world coordinate system according to the first target image;
a second obtaining unit 430, configured to obtain, according to the second target image, two-dimensional coordinate information of the spot corresponding to the spot beam in the pixel coordinate system;
a determining unit 440, configured to determine a plurality of pairs of initial points according to a preset projection rule, where the pairs of initial points include the three-dimensional coordinate information and the two-dimensional coordinate information corresponding to the light spot;
A third obtaining unit 450 for obtaining a matching confidence of each of the initial point pairs;
the first processing unit 460 is configured to take an initial point pair, for which the matching confidence degree meets a preset condition, as a target point pair, and calculate an external parameter between the distance measurement device and the camera according to the target point pair.
Further, the first obtaining unit 420 is specifically configured to:
acquiring a first coordinate of a spot corresponding to the spot light beam under a pixel coordinate system according to the first target image;
obtaining a distance measurement value corresponding to the spot light beam;
and calculating three-dimensional coordinate information of the light spot corresponding to the spot beam under a world coordinate system according to the internal parameters of the distance measuring device, the first coordinate and the distance measurement value.
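As an illustrative sketch of this back-projection, assuming the intrinsic matrix `K` of the ranging device's collector and treating the distance measurement as the range along the line of sight (under a z-depth convention the normalization below would be dropped):

```python
import numpy as np

def spot_to_3d(u, v, distance, K):
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])  # back-project the pixel
    ray /= np.linalg.norm(ray)                      # unit line-of-sight direction
    return distance * ray   # 3D spot coordinates in the ranging-device frame
```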
Further, the determining unit 440 is further configured to:
calculating the parallax value of the light spot corresponding to the spot beam according to the distance measurement value;
correcting the spatial distribution of the light spots in the second target image according to the parallax value;
and determining a plurality of pairs of initial points by matching the corrected second target image and the first target image.
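One possible reading of this correction is sketched below, under the assumption of a known approximate baseline `b` between the ranging device and the camera and a camera focal length `f` in pixels (both hypothetical parameters here): the expected parallax of each spot is inversely proportional to its measured distance, and the spot positions in the second target image are shifted accordingly before matching.

```python
import numpy as np

def correct_spot_positions(spots_uv, distances, f, b):
    """Shift spots in the second target image by distance-dependent parallax."""
    d = np.asarray(distances, dtype=np.float64)
    disparity = f * b / d                    # parallax value per spot (pixels)
    corrected = np.asarray(spots_uv, dtype=np.float64).copy()
    corrected[:, 0] -= disparity             # assume baseline along the x axis
    return corrected
```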
Further, the determining unit 440 is specifically configured to:
when the preset projection rule is a time coding rule, matching the three-dimensional coordinate information and the two-dimensional coordinate information corresponding to the same moment to obtain the plurality of pairs of initial point pairs;
when the preset projection rule is a space coding rule, matching the spatial distribution of the light spots in the first target image and the second target image to obtain the plurality of pairs of initial point pairs.
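As a hedged sketch of the time-coding case, assuming the 3D and 2D spot detections have already been grouped by emission time slot (the dictionary inputs are illustrative, not part of the claims):

```python
def match_by_time_code(frames_3d, frames_2d):
    """Pair 3D and 2D spots emitted and imaged in the same time slot."""
    pairs = []
    for t, pts3d in frames_3d.items():
        pts2d = frames_2d.get(t)
        if pts2d is None or len(pts2d) != len(pts3d):
            continue                     # skip slots with dropped/extra spots
        pairs.extend(zip(pts3d, pts2d))  # initial point pairs for slot t
    return pairs
```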
Further, the first processing unit 460 is specifically configured to:
constructing a reprojection error function model;
and carrying out iterative computation according to the target point pairs and the reprojection error function model to obtain the minimum reprojection error and the external parameters between the distance measuring device and the camera.
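A minimal sketch of such a reprojection error function model follows, assuming SciPy's least-squares solver for the iterative computation; the patent does not prescribe a particular optimizer, and the parameter vector could equally be extended with the internal parameters of both devices to realize the joint optimization described next.

```python
import numpy as np
import cv2
from scipy.optimize import least_squares

def refine_extrinsics(points_3d, points_2d, K, dist_coeffs, rvec0, tvec0):
    pts3d = np.asarray(points_3d, np.float64)
    pts2d = np.asarray(points_2d, np.float64)

    def residuals(x):
        rvec, tvec = x[:3].reshape(3, 1), x[3:].reshape(3, 1)
        proj, _ = cv2.projectPoints(pts3d, rvec, tvec, K, dist_coeffs)
        return (proj.reshape(-1, 2) - pts2d).ravel()    # reprojection error

    x0 = np.hstack([np.ravel(rvec0), np.ravel(tvec0)])
    sol = least_squares(residuals, x0)                  # iterative minimization
    return sol.x[:3], sol.x[3:], sol.cost  # rvec, tvec, minimum error (0.5*SSE)
```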
Further, the first processing unit 460 is further configured to:
optimizing a first internal-parameter initial value of a depth camera of the distance measuring device and a second internal-parameter initial value of the camera in the fusion system to obtain an optimized first target internal parameter of the distance measuring device and an optimized second target internal parameter of the camera.
Further, the calibration device 4 of the distance measuring device and camera fusion system further comprises:
the second processing unit is used for determining overall confidence according to the matching confidence of each initial point pair;
and the third processing unit is used for determining that the working state is abnormal and stopping the calculation of the external parameters between the distance measuring device and the camera if the overall confidence is smaller than a preset threshold.
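A short sketch of this gate follows; the aggregation (mean) and the threshold value are illustrative assumptions, not specified by the patent.

```python
import numpy as np

def extrinsics_allowed(match_confidences, threshold=0.5):
    """Abort calibration when the overall confidence is below the threshold."""
    overall = float(np.mean(match_confidences))   # one possible aggregation
    return overall >= threshold   # False -> abnormal state, stop calculating
```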
Fig. 5 is a schematic diagram of a calibration device of a distance measuring device and camera fusion system according to a third embodiment of the present application. As shown in fig. 5, the calibration device 5 of this embodiment includes: a processor 50, a memory 51, and a computer program 52 stored in the memory 51 and executable on the processor 50, for example a calibration program of the distance measuring device and camera fusion system. When executing the computer program 52, the processor 50 implements the steps of the calibration method embodiments described above, such as steps 101 to 106 shown in fig. 1. Alternatively, when executing the computer program 52, the processor 50 implements the functions of the modules/units of the apparatus embodiments described above, such as the functions of modules 410 to 460 shown in fig. 4.
By way of example, the computer program 52 may be partitioned into one or more modules/units, which are stored in the memory 51 and executed by the processor 50 to implement the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, the instruction segments being used to describe the execution of the computer program 52 in the calibration device 5. For example, the computer program 52 may be divided into a control unit, a first acquisition unit, a second acquisition unit, a determination unit, a third acquisition unit, and a first processing unit, each unit functioning as follows:
The control unit is used for controlling the distance measuring device to project a spot light beam to a target scene and collect the spot light beam to obtain a first target image, and synchronously controlling the camera to collect the target scene to obtain a second target image; the second target image is an imaging result in an infrared band;
the first acquisition unit is used for acquiring three-dimensional coordinate information of the light spot corresponding to the spot light beam under a world coordinate system according to the first target image;
the second acquisition unit is used for acquiring two-dimensional coordinate information of the light spot corresponding to the spot light beam under a pixel coordinate system according to the second target image;
the determining unit is used for determining a plurality of pairs of initial point pairs according to a preset projection rule, wherein the initial point pairs comprise the three-dimensional coordinate information and the two-dimensional coordinate information corresponding to the light spots;
the third acquisition unit is used for obtaining a matching confidence of each initial point pair;
the first processing unit is used for taking an initial point pair with the matching confidence degree meeting a preset condition as a target point pair and calculating an external parameter between the distance measuring device and the camera according to the target point pair.
The calibration device of the distance measuring device and camera fusion system may include, but is not limited to, the processor 50 and the memory 51. It will be appreciated by those skilled in the art that fig. 5 is merely an example of the calibration device 5 and does not constitute a limitation thereof; the device may include more or fewer components than illustrated, combine certain components, or use different components; for example, it may further include input/output devices, network access devices, a bus, and the like.
The processor 50 may be a central processing unit (Central Processing Unit, CPU), another general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 51 may be an internal storage unit of the calibration device 5 of the distance measuring device and camera fusion system, for example a hard disk or memory of the calibration device 5. The memory 51 may also be an external storage device of the calibration device 5, for example a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card provided on the calibration device 5. Further, the memory 51 may include both an internal storage unit and an external storage device of the calibration device 5. The memory 51 is used for storing the computer program and other programs and data required by the calibration device, and may also be used to temporarily store data that has been output or is to be output.
It should be noted that, because the content of information interaction and execution process between the above devices/units is based on the same concept as the method embodiment of the present application, specific functions and technical effects thereof may be referred to in the method embodiment section, and will not be described herein.
The embodiment of the application also provides a network device, comprising: at least one processor, a memory, and a computer program stored in the memory and executable on the at least one processor, the processor implementing the steps of any of the above method embodiments when executing the computer program.
The embodiment of the application also provides a computer readable storage medium storing a computer program which, when executed by a processor, implements the steps of the above method embodiments.
The embodiment of the application also provides a computer program product which, when run on a mobile terminal, causes the mobile terminal to implement the steps of the above method embodiments.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the present application may implement all or part of the flow of the methods of the above embodiments by instructing related hardware through a computer program, which may be stored in a computer readable storage medium; when executed by a processor, the computer program implements the steps of each of the method embodiments described above. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer readable medium may include at least: any entity or device capable of carrying the computer program code to a photographing device/terminal apparatus, a recording medium, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunication signal, and a software distribution medium, such as a USB flash drive, a removable hard disk, a magnetic disk, or an optical disk. In some jurisdictions, in accordance with legislation and patent practice, computer readable media may not include electrical carrier signals and telecommunication signals.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts that are not described or detailed in one embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/network device and method may be implemented in other manners. For example, the apparatus/network device embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical functional division, and there may be additional divisions in actual implementation, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
The above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (10)

1. A calibration method of a distance measuring device and camera fusion system, characterized by comprising the following steps:
controlling the distance measuring device to project a spot beam to a target scene and collect the spot beam to obtain a first target image, and synchronously controlling the camera to collect the target scene to obtain a second target image; wherein the second target image is an imaging result in an infrared band, and the acquisition time of the first target image is synchronized with that of the second target image;
Acquiring three-dimensional coordinate information of a spot corresponding to the spot beam under a world coordinate system according to the first target image;
acquiring two-dimensional coordinate information of a spot corresponding to the spot beam under a pixel coordinate system according to the second target image;
determining a plurality of pairs of initial point pairs according to a preset projection rule, wherein the initial point pairs comprise the three-dimensional coordinate information and the two-dimensional coordinate information corresponding to the light spots;
obtaining the matching confidence of each initial point pair;
and taking the initial point pair with the matching confidence degree meeting a preset condition as a target point pair, and calculating an external parameter between the distance measuring device and the camera according to the target point pair.
2. The calibration method of the distance measuring device and camera fusion system according to claim 1, wherein acquiring the three-dimensional coordinate information of the spot corresponding to the spot beam under the world coordinate system according to the first target image comprises:
acquiring a first coordinate of a spot corresponding to the spot light beam under a pixel coordinate system according to the first target image;
obtaining a distance measurement value corresponding to the spot light beam;
and calculating three-dimensional coordinate information of the light spot corresponding to the spot beam under a world coordinate system according to the internal parameters of the distance measuring device, the first coordinate and the distance measurement value.
3. The method for calibrating a distance measuring device and camera fusion system according to claim 2, further comprising, before determining the plurality of pairs of initial points according to a preset projection rule:
calculating the parallax value of the light spot corresponding to the spot beam according to the distance measurement value;
correcting the spatial distribution of the light spots in the second target image according to the parallax value;
and determining a plurality of pairs of initial points by matching the corrected second target image and the first target image.
4. The calibration method of the distance measuring device and camera fusion system according to claim 1, wherein determining the plurality of pairs of initial point pairs according to the preset projection rule comprises:
when the preset projection rule is a time coding rule, matching the three-dimensional coordinate information and the two-dimensional coordinate information corresponding to the same moment to obtain the plurality of pairs of initial point pairs;
when the preset projection rule is a space coding rule, matching the spatial distribution of the light spots in the first target image and the second target image to obtain the plurality of pairs of initial point pairs.
5. The method for calibrating a distance measurement device and camera fusion system according to claim 1, wherein calculating an external parameter between the distance measurement device and the camera from the target point pair comprises:
Constructing a reprojection error function model;
and carrying out iterative computation according to the target point pair and the re-projection error function model to obtain the minimum re-projection error and the external parameters between the distance measuring device and the camera.
6. The method for calibrating a distance measurement device and camera fusion system according to claim 5, further comprising, after performing iterative computation according to the target point pair and the re-projection error function model to obtain a minimum re-projection error and an external parameter between the distance measurement device and the camera:
optimizing a first internal-parameter initial value of a depth camera of the distance measuring device and a second internal-parameter initial value of the camera in the fusion system to obtain an optimized first target internal parameter of the distance measuring device and an optimized second target internal parameter of the camera.
7. The calibration method of the distance measuring device and camera fusion system according to claim 1, further comprising, after obtaining the matching confidence of each of the initial point pairs:
determining overall confidence according to the matching confidence of each initial point pair;
and if the overall confidence coefficient is smaller than a preset threshold value, judging that the working state is abnormal, and stopping calculating the external parameters between the distance measuring device and the camera.
8. A calibration device for a fusion system of a distance measuring device and a camera, comprising:
the control unit is used for controlling the distance measuring device to project a spot light beam to a target scene and collect the spot light beam to obtain a first target image, and synchronously controlling the camera to collect the target scene to obtain a second target image; the second target image is an imaging result in an infrared band;
the first acquisition unit is used for acquiring three-dimensional coordinate information of the light spot corresponding to the spot light beam under a world coordinate system according to the first target image;
the second acquisition unit is used for acquiring two-dimensional coordinate information of a spot corresponding to the spot light beam under a pixel coordinate system according to the second target image, and acquisition time of the first target image and acquisition time of the second target image are synchronous;
the determining unit is used for determining a plurality of pairs of initial point pairs according to a preset projection rule, wherein the initial point pairs comprise the three-dimensional coordinate information and the two-dimensional coordinate information corresponding to the light spots;
a third obtaining unit, configured to obtain a matching confidence of each initial point pair;
the first processing unit is used for taking an initial point pair with the matching confidence degree meeting a preset condition as a target point pair and calculating an external parameter between the distance measuring device and the camera according to the target point pair.
9. A calibration device of a distance measuring device and camera fusion system, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the method according to any one of claims 1 to 7 when executing the computer program.
10. A computer readable storage medium storing a computer program, characterized in that the computer program when executed by a processor implements the method according to any one of claims 1 to 7.
CN202110680076.3A 2021-06-18 2021-06-18 Calibration method and device for distance measuring device and camera fusion system Active CN113538592B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110680076.3A CN113538592B (en) 2021-06-18 2021-06-18 Calibration method and device for distance measuring device and camera fusion system

Publications (2)

Publication Number Publication Date
CN113538592A CN113538592A (en) 2021-10-22
CN113538592B true CN113538592B (en) 2023-10-27

Family

ID=78125131

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110680076.3A Active CN113538592B (en) 2021-06-18 2021-06-18 Calibration method and device for distance measuring device and camera fusion system

Country Status (1)

Country Link
CN (1) CN113538592B (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113538591B (en) * 2021-06-18 2024-03-12 深圳奥锐达科技有限公司 Calibration method and device for distance measuring device and camera fusion system
CN113888449B (en) * 2021-12-08 2022-02-22 深圳市先地图像科技有限公司 Image processing method and system for laser imaging and related equipment
CN114235351B (en) * 2021-12-17 2023-10-31 深圳市先地图像科技有限公司 Method, system and related equipment for detecting laser spot offset in laser array
CN114092569B (en) * 2022-01-19 2022-08-05 安维尔信息科技(天津)有限公司 Binocular camera online calibration method and system based on multi-sensor fusion
CN115170665B (en) * 2022-07-08 2023-08-01 北京航空航天大学 Image-based spherical object pose determination method and system
CN115231407B (en) * 2022-07-15 2023-09-15 日立楼宇技术(广州)有限公司 Displacement detection method, device and equipment of elevator and storage medium
CN115375772B (en) * 2022-08-10 2024-01-19 北京英智数联科技有限公司 Camera calibration method, device, equipment and storage medium
CN115097427B (en) * 2022-08-24 2023-02-10 北原科技(深圳)有限公司 Automatic calibration method based on time-of-flight method
CN117994121A (en) * 2022-10-28 2024-05-07 华为技术有限公司 Image processing method and electronic equipment
CN116336964B (en) * 2023-05-31 2023-09-19 天津宜科自动化股份有限公司 Object contour information acquisition system
CN117238143B (en) * 2023-09-15 2024-03-22 北京卓视智通科技有限责任公司 Traffic data fusion method, system and device based on radar double-spectrum camera

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109272570A (en) * 2018-08-16 2019-01-25 合肥工业大学 A kind of spatial point three-dimensional coordinate method for solving based on stereoscopic vision mathematical model
CN110657785A (en) * 2019-09-02 2020-01-07 清华大学 Efficient scene depth information acquisition method and system
CN111856433A (en) * 2020-07-25 2020-10-30 深圳奥锐达科技有限公司 Distance measuring system and measuring method
CN112465912A (en) * 2020-11-18 2021-03-09 新拓三维技术(深圳)有限公司 Three-dimensional camera calibration method and device
CN112862897A (en) * 2021-01-29 2021-05-28 武汉惟景三维科技有限公司 Phase-shift encoding circle-based rapid calibration method for camera in out-of-focus state
CN112907727A (en) * 2021-01-25 2021-06-04 中国科学院空天信息创新研究院 Calibration method, device and system of relative transformation matrix

Also Published As

Publication number Publication date
CN113538592A (en) 2021-10-22

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant