CN113219475B - Method and system for correcting monocular distance measurement by using single line laser radar - Google Patents


Info

Publication number
CN113219475B
CN113219475B (application CN202110763071.7A)
Authority
CN
China
Prior art keywords
image
ranging
monocular camera
matrix
distance measurement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110763071.7A
Other languages
Chinese (zh)
Other versions
CN113219475A (en
Inventor
翟涌
石兆宁
张幽彤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Technology BIT
Original Assignee
Beijing Institute of Technology BIT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Technology BIT filed Critical Beijing Institute of Technology BIT
Priority to CN202110763071.7A priority Critical patent/CN113219475B/en
Publication of CN113219475A publication Critical patent/CN113219475A/en
Application granted granted Critical
Publication of CN113219475B publication Critical patent/CN113219475B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 - Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 - Systems determining position data of a target
    • G01S17/08 - Systems determining position data of a target for measuring distance only
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S11/00 - Systems for determining distance or velocity not using reflection or reradiation
    • G01S11/12 - Systems for determining distance or velocity not using reflection or reradiation using electromagnetic waves other than radio waves
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86 - Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/08 - Learning methods

Abstract

The invention provides a method and a system for correcting monocular distance measurement by using a single-line laser radar. The method comprises: acquiring an image collected by the monocular camera; and inputting the image into a pre-trained ranging neural network model to obtain distance information of a target in the image. The ranging neural network model is trained from a training data set and a global reference error matrix determined by the ranging of the plurality of laser radars, the training data set comprising a plurality of images and the actual distances of the pixels in the images. By combining laser radars with a monocular camera, the invention uses several laser radars to provide accurate multi-point distance information with which the global error of monocular ranging is corrected; this simplifies the ranging algorithm, improves precision, reduces cost, and integrates well with various monocular ranging algorithms.

Description

Method and system for correcting monocular distance measurement by using single line laser radar
Technical Field
The invention relates to the technical field of distance measurement, in particular to a method and a system for correcting monocular distance measurement by using a single-line laser radar.
Background
A common ranging scheme for current unmanned vehicles maps a target from a two-dimensional image into a three-dimensional point cloud in order to delineate the target's position and boundary in three-dimensional space; that is, depth information is obtained mainly from a continuously rotating laser radar. This places high demands on the radar's rotation speed and number of beams, and although laser radar precision is good, the cost rises sharply. If only monocular ranging is used, the precision of existing monocular ranging is insufficient.
Disclosure of Invention
The invention addresses the problem that existing ranging schemes cannot achieve both high precision and low cost.
In order to solve the above problems, the present invention provides a method for correcting monocular distance measurement using a single line lidar, which is applied to a distance measurement system including a monocular camera and a plurality of lidars, wherein the monocular camera and the plurality of lidars are arranged at predetermined positions, and the method includes: acquiring an image acquired by the monocular camera; inputting the image into a distance measurement neural network model trained in advance to obtain distance information of a target in the image; the ranging neural network model is obtained by training a training data set and a global reference error matrix determined by the ranging of the plurality of laser radars, wherein the training data set comprises a plurality of images and actual distances of pixels in the images.
Optionally, the training process of the ranging neural network model is as follows: extracting pixel characteristics of each image in the training data set; and coupling the pixel characteristic corresponding matrix and the global reference error matrix of the image, and inputting the coupled pixel characteristic corresponding matrix and the global reference error matrix of the image into a ranging neural network model for training until a loss function meets a preset termination condition.
Optionally, the optical axis of the monocular camera is arranged parallel to the ray direction of each laser radar; at least one laser radar is arranged close to the monocular camera, and the remaining laser radars are evenly arranged around the monocular camera.
Optionally, the pixel-feature matrix of the image and the global reference error matrix are coupled by the following formula:

L_n = γ · L_0 ⊙ S_n^k1 ⊙ O_n^k2

where L_n is the coupled distance error matrix; γ is a correction coefficient; L_0 is the global reference error matrix determined by the ranging of the plurality of laser radars; S_n is the matrix of distances of all pixels in the image from the image center point; O_n is the matrix formed by the ratio of each blur circle in the image to the blur circle at (or closest to) the image center point; k1 and k2 are pre-calibrated exponents; and ⊙ denotes the element-wise product.
Optionally, the error between the ranging result of the laser radar close to the monocular camera and the ranging result of the monocular camera is taken as the global reference error L_0′, and the error between the ranging results of the laser radars evenly distributed around the monocular camera and the ranging result of the monocular camera is taken as the local reference error L_1; L_0 is then:

L_0 = L_0′ + λ · L_1

where λ is the influence coefficient weighting the local reference error relative to the global reference error.
Optionally, the method further comprises: performing polynomial fitting on the errors measured at different positions along a radial straight line to determine the exponent k1 of the relative position; and performing polynomial fitting on the errors measured at different positions along the axial direction to determine the exponent k2 of the blur circle.
Optionally, the ranging points of the lidar, which are uniformly arranged around the monocular camera, all fall on a diagonal of the image, and each of the ranging points is uniformly distributed on the diagonal.
Optionally, the ranging points of the lidar, which are uniformly arranged around the monocular camera, all fall on a center line of the image, and each ranging point is uniformly distributed on the center line.
The invention provides a system for correcting monocular distance measurement by using a single-line laser radar, comprising a ranging neural network model, a monocular camera and a plurality of laser radars. The optical axis of the monocular camera is arranged parallel to the ray direction of each laser radar; at least one laser radar is arranged close to the monocular camera, and the remaining laser radars are evenly arranged around the monocular camera. The ranging neural network model determines the distance information of a target in an input image collected by the monocular camera; it is trained from a training data set and a global reference error matrix determined by the ranging of the plurality of laser radars, the training data set comprising a plurality of images and the actual distances of the pixels in the images.
Optionally, the ranging points of the laser radar uniformly arranged around the monocular camera all fall on a diagonal of the image, and each ranging point is uniformly distributed on the diagonal; or the ranging points of the laser radar which are uniformly distributed around the monocular camera all fall on a center line of the image, and the ranging points are uniformly distributed on the center line.
By combining laser radars with a monocular camera, the invention uses several laser radars to provide accurate multi-point distance information with which the global error of monocular ranging is corrected; this simplifies the ranging algorithm, improves precision, reduces cost, and integrates well with various monocular ranging algorithms.
Drawings
To illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings used in their description are briefly introduced below. The drawings described below show only embodiments of the present invention; those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a schematic diagram illustrating a defocus distance measurement algorithm according to an embodiment of the present invention;
FIG. 2 is a schematic flowchart of a method for calibrating monocular distance measurement using a single line laser radar according to an embodiment of the present invention;
fig. 3 is a schematic diagram of a training process of a monocular distance-measuring correction neural network model according to an embodiment of the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in detail below. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The distance information of the target in the embodiment of the invention is mainly acquired by a monocular camera. The precision of the existing monocular distance measurement algorithm is generally not high, the embodiment of the invention adopts a single line laser radar to correct the depth value of a specific pixel point in a two-dimensional image, and then uses the correction information of the single pixel point to correct the whole image, thereby improving the precision of monocular distance measurement.
It should be noted that the ranging errors of the individual points in a monocular ranging algorithm are correlated with one another. Taking the defocus ranging algorithm as an example: defocus-based three-dimensional measurement directly uses the relationship among object depth, camera lens parameters and image blur to measure the depth of an object.
Fig. 1 shows a schematic diagram of the defocus ranging algorithm. When the image point does not coincide with the imaging plane c, object point a forms on plane c not a sharp image point but a blurred spot with the same shape as the aperture stop; for a typical defocus ranging system the aperture stop is circularly symmetric, so the blurred spot is circular. As shown in fig. 1, the blur-spot radii R1 and R2 are related to the optical parameters of the system (e.g. the lens aperture D and the distances s1, s2 of the imaging plane c from the lens b) and the object distance u of object point a as follows:
1/f = 1/u + 1/v (1)

R1 = (D · s1 / 2) · |1/v − 1/s1| (2)

R2 = (D · s2 / 2) · |1/v − 1/s2| (3)

where f is the focal length and v is the image distance at which object point a is in sharp focus.
With the optical parameters of the system known, the object distance u of a point on the object can be calculated by measuring the radius of its blur spot. The monocular ranging here can therefore be regarded as a deterministic function:

u = F(R1, R2, L) (4)

where L denotes the optical parameters.
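For illustration, the deterministic function of formula (4) can be sketched under the standard thin-lens depth-from-defocus model; the numerical lens parameters and function names below are assumptions for the example, not values from the patent.

```python
def blur_radius(u, f, D, s):
    """Blur-circle radius for object distance u, focal length f,
    aperture D and sensor-to-lens distance s (thin-lens model)."""
    v = 1.0 / (1.0 / f - 1.0 / u)       # in-focus image distance (Gaussian lens law)
    return 0.5 * D * s * abs(1.0 / v - 1.0 / s)

def object_distance(R, f, D, s):
    """Invert blur_radius on the far side of the focal plane
    (object beyond the focused distance, where 1/v > 1/s)."""
    inv_v = 2.0 * R / (D * s) + 1.0 / s
    return 1.0 / (1.0 / f - inv_v)

# A 50 mm lens focused at 2 m; the object is actually at 5 m.
f, D = 0.05, 0.02
s = 1.0 / (1.0 / f - 1.0 / 2.0)         # sensor position that focuses at 2 m
R = blur_radius(5.0, f, D, s)
u = object_distance(R, f, D, s)         # recovers the 5 m object distance
```

Measuring the radius at two sensor positions s1 and s2, as in formulas (2) and (3), removes the front/back-of-focus ambiguity that this single-measurement inversion assumes away.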
The optical parameters, however, exert a deterministic but complex influence on R1 and R2:

(R1, R2) = G(u, L) (5)
Let τ denote the algorithm that estimates the blur radius from the camera image:

R̂ = τ(u, L, ε) (6)

i.e. the estimate is influenced by the actual object distance u and by certain uncertainty factors ε.
Therefore, the ranging error of each measurement has a definite relationship with the actual object distance, the optical parameters, and so on, but the relationship is very complicated and its analytical expression is difficult to obtain. Since the distances of the individual points are determined by the same relationship, the errors so determined necessarily share an internal correlation. If this correlation can be exploited, the error of one point can be used to extrapolate the errors of other points, thereby improving the accuracy of monocular ranging.
A neural network is well suited to establishing correlations that are hard to find analytically. To illustrate how such a network works, the individual data and their acquisition are first described. A simple case is that the main optical axis of the monocular camera and the single-line laser radar lie on the same straight line, so the distance measured by the radar is always the distance of the pixel at the center of the camera image. Using the defocus ranging algorithm, a predicted distance matrix S1 of the objects in the image can be obtained:

S1 = [[s11, s12, s13, …],
      [s21, s22, s23, …],
      …]
To obtain the labels of the training set, the real distances between the objects in the image and the camera are measured to form a label matrix Y; a neural network model can then be established and trained. After training, the model can be used to range targets in images collected by the monocular camera.
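As a minimal sketch of this training step, a single linear neuron trained by gradient descent stands in for the full network, and synthetic biased predictions stand in for S1 and the label matrix Y; all names and constants here are illustrative assumptions, not from the patent.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: monocular predictions S1 carry a systematic bias
# relative to the ground-truth label matrix Y.
Y = rng.uniform(2.0, 20.0, size=(200, 1))             # true distances (m)
S1 = 1.1 * Y + 0.5 + rng.normal(0.0, 0.05, Y.shape)   # biased monocular estimates

w, b, lr = 1.0, 0.0, 0.01
for _ in range(5000):
    grad = w * S1 + b - Y                             # d(MSE)/d(prediction)
    w -= lr * np.mean(grad * S1) / np.mean(S1 * S1)   # normalised LMS-style step
    b -= lr * np.mean(grad)

rmse_before = float(np.sqrt(np.mean((S1 - Y) ** 2)))
rmse_after = float(np.sqrt(np.mean((w * S1 + b - Y) ** 2)))
```

The fully-connected network of the patent plays the same role: it is fitted so that the corrected predictions converge toward the label matrix Y.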
Referring to fig. 2, a schematic flow chart of a method for correcting monocular distance measurement using a single line lidar is shown, and the method is applied to a distance measurement system including a monocular camera and a plurality of lidars, the monocular camera and the plurality of lidars are arranged according to a preset position, and the method includes the following steps:
and S202, acquiring an image acquired by the monocular camera.
And S204, inputting the image into a pre-trained ranging neural network model to obtain the distance information of the target in the image.
The ranging neural network model is obtained by training a training data set and a global reference error matrix determined by ranging of a plurality of laser radars, wherein the training data set comprises a plurality of images and actual distances of pixels in the images.
The monocular camera and the plurality of laser radars need to be arranged according to preset positions, so that the ranging result of the monocular camera can be corrected through the ranging result of the laser radars. In order to improve the correction effect and consider the cost, the embodiment provides a specific arrangement mode as follows: the optical axis of the monocular camera is arranged in parallel with the ray direction of each laser radar; at least one laser radar is arranged close to the monocular camera, and other laser radars are evenly arranged around the monocular camera.
The ranging point of the laser radar arranged close to the monocular camera is as close as possible to the central point of the defocused image acquired by the monocular camera. Optionally, the ranging points of the laser radar uniformly arranged around the monocular camera all fall on the diagonal of the image, and all the ranging points are uniformly distributed on the diagonal; or the ranging points of the laser radar uniformly distributed around the monocular camera all fall on the center line of the image, and all the ranging points are uniformly distributed on the center line.
The global reference error matrix of the monocular camera's ranging is determined from the ranging of the multiple laser radars; this matrix is coupled with the pixel-feature matrix of the image and then input into the ranging neural network model for training, so that the relationship between the error and feature parameters such as the actual object distance and the optical parameters can be regression-fitted, yielding a ranging neural network model capable of accurate ranging.
According to the method for correcting monocular distance measurement by using the single-line laser radar, provided by the embodiment of the invention, the mode that the laser radar is combined with the monocular camera is adopted, the plurality of laser radars are used for providing multipoint accurate distance information to correct the global error of monocular distance measurement, the algorithm difficulty of distance measurement can be simplified, the precision is improved, the cost is reduced, and the method is good in fusion with various monocular distance measurement algorithms.
Optionally, the training process of the ranging neural network model is as follows:
firstly, extracting pixel characteristics of each image in a training data set; and then, coupling the pixel characteristic corresponding matrix and the global reference error matrix of the image, and inputting the coupled pixel characteristic corresponding matrix and the global reference error matrix of the image into a ranging neural network model for training until the loss function meets a preset termination condition.
The pixel characteristics of the image comprise a position matrix and a fuzzy circle size matrix of each pixel, wherein the position matrix represents the distance between each pixel and the central point, the fuzzy circle size matrix represents the ratio of each fuzzy circle relative to the fuzzy circle of the central point or the fuzzy circle close to the central point, and the loss function is the difference value between the model predicted distance and the actual distance.
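For illustration, the two pixel-feature matrices described above can be built as follows; the helper names and the uniform test blur map are assumptions for the example.

```python
import numpy as np

def position_matrix(h, w):
    """Distance of every pixel from the image center (the matrix S_n)."""
    ys, xs = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    return np.hypot(ys - cy, xs - cx)

def blur_ratio_matrix(blur, eps=1e-6):
    """Ratio of each blur-circle radius to the radius at the center
    pixel (the matrix O_n)."""
    h, w = blur.shape
    center = blur[h // 2, w // 2]
    return blur / max(float(center), eps)

Sn = position_matrix(5, 5)                    # 0 at the center pixel
On = blur_ratio_matrix(np.full((5, 5), 2.0))  # uniform blur -> all ratios 1
```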
The inputs to the ranging neural network model are the original image, the monocular camera's distance prediction matrix A1, and an original correction matrix B; the output is the final distance prediction matrix A′.
The information provided by the original image includes the positions of the pixels, the blur-circle sizes, the whole target, and so on; the relative position, relative blur-circle size, etc. serve as the basis for adjusting the correction matrix B, i.e. the adjusted correction matrix is:

B′ = f(position, blur circle, B) (7)
The relation between input and output is:

A′ = A1 + B′ (8)

If a true distance matrix A is obtained by actual measurement, then B′ = A − A1 is the desired optimized correction matrix, so B′ − B can be used as the objective function.
Specifically, the pixel-feature matrix of the image and the global reference error matrix are coupled as follows:

L_n = γ · L_0 ⊙ S_n^k1 ⊙ O_n^k2 (9)

where L_n is the coupled distance error matrix; γ is a correction coefficient; L_0 is the global reference error matrix determined by the ranging of the plurality of laser radars; S_n is the matrix of distances of all pixels in the image from the image center point; O_n is the matrix formed by the ratio of each blur circle in the image to the blur circle at (or closest to) the image center point; k1 and k2 are pre-calibrated exponents; and ⊙ denotes the element-wise product. The exponent k2 of the blur-circle size matrix and the exponent k1 of the relative-position matrix in the coupling formula are obtained by calibration.
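A direct sketch of the coupling in formula (9); the toy matrix values are illustrative, not from the patent.

```python
import numpy as np

def couple_error_matrix(L0, Sn, On, gamma, k1, k2):
    """L_n = gamma * L0 ⊙ Sn^k1 ⊙ On^k2, with ⊙ the element-wise product."""
    return gamma * L0 * np.power(Sn, k1) * np.power(On, k2)

L0 = np.full((2, 2), 0.5)                 # global reference error matrix
Sn = np.array([[1.0, 2.0], [2.0, 1.0]])   # distance of each pixel from the center
On = np.array([[1.0, 1.5], [1.5, 1.0]])   # blur-circle ratios
Ln = couple_error_matrix(L0, Sn, On, gamma=1.0, k1=2, k2=1)
```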
If only the relation between relative position and ranging error is considered, let L_0 be the monocular ranging error measured at the laser ranging point (for simplicity, only the case where the laser ranging point is the center pixel of the camera image is considered here). From prior knowledge, the following relationship holds with the distance S_n of any pixel from the center point:

L_n = α · L_0 ⊙ S_n^k1 (10)

where L_n is the error at the pixel corresponding to S_n and α is a correction coefficient; since the error of the center point, after subtracting the error of the laser radar ranging, is taken to be 0, there is no offset term. ⊙ denotes the element-wise product.
Similarly, let O_n be the size of any blur circle relative to the blur circle at (or near) the center point (using an equivalent measure if the spot is not circular); then:

L_n = β · L_0 ⊙ O_n^k2 (11)

where L_n is the error at the pixel corresponding to O_n and β is a correction coefficient.
Assuming that the blur circle and the pixel position are independent:

L_n = γ · L_0 ⊙ S_n^k1 ⊙ O_n^k2 (12)

The three matrices are multiplied according to this formula and fed into a fully-connected neural network for training; the network is optimized against the actual errors measured in experiments so that the final result converges to the actual distance matrix, yielding the trained monocular ranging correction neural network model.
Referring to the schematic diagram of the training process of the monocular ranging correction neural network model shown in fig. 3: an image captured by the monocular camera is processed with the relevant optical formulas to obtain the blur-circle size matrix O_n, which is coupled with the original correction matrix and the position matrix as in formula (9). The result serves as the input of the fully-connected neural network, whose output is a corrected depth map; the corrected depth map is compared with the actually measured depth map to form the loss function, and the network weights are updated by gradient descent until the loss is acceptable.
In practical applications, to further improve the correction effect while taking cost and other factors into account, this embodiment proposes, by way of example, an improved method using an arrangement of 1 camera and 5 laser radars together with an improved coupling formula.
The main optical axis of the monocular camera is arranged parallel to the ray directions of the 5 single-line laser radars, ensuring that the pixel positions of the laser ranging points in the image acquired by the monocular camera are essentially fixed. One laser radar is fixed close to the monocular camera, so that its laser ranging point lies as close as possible to the center of the defocused image obtained by the camera; the other 4 laser radars are placed slightly away from the camera and evenly arranged around it, for example so that the 4 laser ranging points trisect the 2 diagonals of the image. The camera-image pixels corresponding to the laser ranging points then lie essentially at the center of the image and at the trisection points of the 2 diagonals.
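Where the four outer ranging points fall under this trisection layout can be computed as follows (a hypothetical helper; pixel coordinates are (row, column)).

```python
def diagonal_trisection_points(h, w):
    """Pixel coordinates of the points trisecting both image diagonals."""
    pts = []
    for (y0, x0), (y1, x1) in [((0, 0), (h - 1, w - 1)),
                               ((0, w - 1), (h - 1, 0))]:
        for t in (1 / 3, 2 / 3):
            pts.append((round(y0 + t * (y1 - y0)),
                        round(x0 + t * (x1 - x0))))
    return pts

pts = diagonal_trisection_points(301, 301)  # the four outer ranging points
```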
For the improved coupling formula, the error between the ranging of the laser radar close to the center point and the monocular ranging is taken as the global reference error L_0′. The image is divided by pixel position into upper-left, upper-right, lower-left and lower-right quadrants, and the errors between the ranging of the four laser radars farther from the camera and the monocular ranging serve as the four local reference errors L_11, L_12, L_21, L_22. Filling L_11, L_12, L_21, L_22 into the corresponding positions of the error matrix according to the number of pixels gives the local error matrix L_1:

L_1 = [[L_11, L_12],
       [L_21, L_22]]
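Filling the quadrant-wise local error matrix and combining it with the global reference error according to L_0 = L_0′ + λ·L_1 might look like the following sketch (scalar per-quadrant errors are an illustrative simplification).

```python
import numpy as np

def local_error_matrix(L11, L12, L21, L22, h, w):
    """Fill a quadrant-wise local error matrix L_1 of shape (h, w)."""
    L1 = np.empty((h, w))
    L1[:h // 2, :w // 2] = L11   # upper left
    L1[:h // 2, w // 2:] = L12   # upper right
    L1[h // 2:, :w // 2] = L21   # lower left
    L1[h // 2:, w // 2:] = L22   # lower right
    return L1

def global_reference_error(L0_prime, L1, lam):
    """L_0 = L_0' + lambda * L_1."""
    return L0_prime + lam * L1

L1 = local_error_matrix(0.1, 0.2, 0.3, 0.4, 4, 4)
L0 = global_reference_error(0.05, L1, lam=0.5)
```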
specifically, the error between the ranging result of the laser radar close to the monocular camera and the ranging result of the monocular camera is used as the global reference errorL 0 Taking the error between the ranging result of the laser radar uniformly distributed around the monocular camera and the ranging result of the monocular camera as a local reference errorL 1L 0The following were used:
L 0=L 0'+λ∙L 1
wherein λ is an influence coefficient of the global reference error and the local reference error.
Optionally, the method further comprises calibrating the exponent k2 of the blur-circle size matrix and the exponent k1 of the relative-position matrix. Since cameras differ, the exponents for the blur circle, position, etc. differ as well, and they can be calibrated as follows: perform polynomial fitting on the errors measured at different positions along a radial straight line to determine the relative-position exponent k1; perform polynomial fitting on the errors measured at different positions along the axial direction to determine the blur-circle exponent k2.
Test targets are placed at different axial and radial positions relative to the camera to obtain the relationships between error, blur circle and pixel position; fitting these relationships with polynomials calibrates the exponents of the blur-circle size O_n and the pixel position S_n.
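A sketch of the exponent calibration: synthetic data with a known quadratic error profile stands in for the measured errors, and the degree-selection heuristic is an assumed reading of the fitting procedure, not the patent's exact method.

```python
import numpy as np

def calibrate_exponent(x, err, max_degree=5):
    """Choose the lowest polynomial degree that clearly explains err(x)."""
    best_deg, best_res = 1, float("inf")
    for deg in range(1, max_degree + 1):
        _, residuals, *_ = np.polyfit(x, err, deg, full=True)
        res = max(float(residuals[0]) if residuals.size else 0.0, 1e-12)
        if res < 0.5 * best_res:           # demand a clear improvement
            best_deg, best_res = deg, res
    return best_deg

# Synthetic radial sweep: error grows quadratically with distance from center.
x = np.linspace(0.1, 1.0, 50)
k1 = calibrate_exponent(x, 0.3 * x ** 2)
```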
The error correction method in this embodiment is also applicable to other monocular ranging algorithms; if an arrangement in which the laser radars form an angle with the camera is adopted, the laser ranging points must first be matched to the corresponding camera pixels.
The embodiment of the invention also provides a system for correcting monocular distance measurement by using the single line laser radar, which can execute the method for correcting monocular distance measurement by using the single line laser radar provided by the embodiment, and specifically comprises a distance measurement neural network model, a monocular camera and a plurality of laser radars.
The optical axis of the monocular camera is arranged parallel to the ray direction of each laser radar; at least one laser radar is arranged close to the monocular camera, and the remaining laser radars are evenly arranged around it. The ranging neural network model determines the distance information of a target in an input image collected by the monocular camera; it is trained from a training data set and a global reference error matrix determined by the ranging of the plurality of laser radars, the training data set comprising a plurality of images and the actual distances of the pixels in the images.
According to the system for correcting the monocular distance measurement by using the single-line laser radar, provided by the embodiment of the invention, the mode that the laser radar is combined with the monocular camera is adopted, and the plurality of laser radars are used for providing multipoint accurate distance information to correct the global error of the monocular distance measurement, so that the algorithm difficulty of the distance measurement can be simplified, the accuracy is improved and the cost is reduced.
Optionally, the ranging points of the laser radar uniformly arranged around the monocular camera all fall on the diagonal of the image, and all the ranging points are uniformly distributed on the diagonal; or the ranging points of the laser radar uniformly distributed around the monocular camera all fall on the center line of the image, and all the ranging points are uniformly distributed on the center line.
Those skilled in the art will appreciate that all or part of the processes in the methods of the above embodiments may be implemented by a computer program instructing the relevant hardware; the program may be stored in a computer-readable storage medium and, when executed, may include the processes of the above method embodiments. The storage medium may be a memory, a magnetic disk, an optical disk, or the like.
In this document, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The embodiments in the present description are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (8)

1. A method for correcting monocular distance measurement by using a single-line laser radar, characterized in that it is applied to a ranging system comprising a monocular camera and a plurality of laser radars arranged at preset positions, the method comprising the following steps:
acquiring an image acquired by the monocular camera;
inputting the image into a pre-trained ranging neural network model to obtain distance information of a target in the image; the ranging neural network model is obtained by training with a training data set and a global reference error matrix determined by the ranging results of the plurality of laser radars, the training data set comprising a plurality of images and the actual distances of the pixels in those images;
the training process of the ranging neural network model comprises: extracting the pixel features of each image in the training data set; coupling the matrix corresponding to the pixel features with the global reference error matrix determined by the ranging results of the plurality of laser radars, and inputting the coupled result into the ranging neural network model for training until the loss function meets a preset termination condition;
the pixel characteristic corresponding matrix and the global reference error matrix coupling formula determined by the plurality of laser radar ranging are as follows:
L_n = γ · (L_0 ⨀ S_n^k1 ⨀ O_n^k2)
wherein L_n is the coupled distance error matrix, γ is a correction coefficient, L_0 is the global reference error matrix determined by the ranging results of the plurality of laser radars, S_n is the matrix of the distances of all pixels in the image from the image center point, O_n is the matrix of the ratios of the circle of confusion at each point in the image to the circle of confusion at or near the image center point, k1 and k2 are pre-calibrated exponents, and ⨀ denotes the element-wise product.
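As a minimal sketch of the element-wise coupling described above: the source renders the formula only as an image, so the composition L_n = γ·(L_0 ⨀ S_n^k1 ⨀ O_n^k2) assumed here is a reading from the symbol definitions, not the patented formula verbatim.

```python
import numpy as np

# Sketch of the element-wise coupling, assuming
# L_n = gamma * (L0 ⊙ S**k1 ⊙ O**k2). Function and argument names are
# illustrative; the exact formula is shown only as an image in the source.
def couple_error(L0, S, O, k1, k2, gamma):
    """Couple the global reference error matrix L0 with the pixel
    center-distance matrix S and the blur-circle ratio matrix O."""
    return gamma * L0 * np.power(S, k1) * np.power(O, k2)
```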
2. The method according to claim 1, characterized in that the optical axis of the monocular camera is arranged parallel to the ray direction of each laser radar;
at least one laser radar is arranged close to the monocular camera, and the remaining laser radars are arranged uniformly around the monocular camera.
3. The method of claim 1, wherein the error between the ranging result of the laser radar close to the monocular camera and the ranging result of the monocular camera is taken as the global reference error L_0, and the error between the ranging results of the laser radars uniformly arranged around the monocular camera and the ranging result of the monocular camera is taken as the local reference error L_1; L_0 is as follows:
[Formula shown only as an image in the source: L_0 expressed in terms of the local reference error L_1 and the coefficient λ.]
wherein λ is an influence coefficient weighting the global reference error and the local reference error.
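Purely as an illustration of such a weighting: since the source shows the formula only as an image, the λ-weighted sum below is an assumption, not the patented combination.

```python
import numpy as np

# Illustrative sketch only: the patent's formula combining the global
# reference error L0 with the local reference error L1 via the influence
# coefficient lam is shown only as an image in the source, so this
# weighted sum is an assumption, not the patented formula.
def combine_errors(L0, L1, lam):
    """Blend global and local reference error matrices with weight lam."""
    return lam * L0 + (1.0 - lam) * L1
```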
4. The method of claim 1, further comprising:
performing polynomial fitting on the errors measured at different positions along a straight line in the radial direction to determine the exponent k1 of the relative position;
performing polynomial fitting on the errors measured at different positions along a straight line in the axial direction to determine the exponent k2 of the circle of confusion.
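The fitting step of claim 4 can be sketched with NumPy; here the exponent is estimated by a log-log linear fit of error against position, which is an assumed realization — the patent's actual calibration procedure may differ.

```python
import numpy as np

# Sketch of claim 4's exponent calibration: fit error ≈ c * position**k
# by a linear fit in log-log space; the slope is the exponent k (used for
# k1 along the radial line, k2 along the axial line). The log-log approach
# is an assumption, not stated in the patent.
def fit_exponent(positions, errors):
    """Estimate the exponent k such that error ≈ c * position**k."""
    logp = np.log(np.asarray(positions, dtype=float))
    loge = np.log(np.asarray(errors, dtype=float))
    k, _ = np.polyfit(logp, loge, 1)  # slope of the log-log fit
    return k
```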
5. The method of claim 2, wherein the ranging points of the laser radars arranged uniformly around the monocular camera all fall on a diagonal of the image, and the ranging points are distributed uniformly on the diagonal.
6. The method of claim 2, wherein the ranging points of the laser radars arranged uniformly around the monocular camera all fall on a center line of the image, and the ranging points are distributed uniformly on the center line.
7. A system for correcting monocular distance measurement by using a single-line laser radar, characterized by comprising a ranging neural network model, a monocular camera, and a plurality of laser radars;
the optical axis of the monocular camera is arranged in parallel with the ray direction of each laser radar;
at least one laser radar is arranged close to the monocular camera, and the rest laser radars are uniformly arranged around the monocular camera;
the ranging neural network model is used for determining distance information of a target in an image acquired by the monocular camera; the ranging neural network model is obtained by training with a training data set and a global reference error matrix determined by the ranging results of the plurality of laser radars, the training data set comprising a plurality of images and the actual distances of the pixels in those images;
the training process of the ranging neural network model comprises: extracting the pixel features of each image in the training data set; coupling the matrix corresponding to the pixel features with the global reference error matrix determined by the ranging results of the plurality of laser radars, and inputting the coupled result into the ranging neural network model for training until the loss function meets a preset termination condition;
the pixel characteristic corresponding matrix and the global reference error matrix coupling formula determined by the plurality of laser radar ranging are as follows:
L_n = γ · (L_0 ⨀ S_n^k1 ⨀ O_n^k2)
wherein L_n is the coupled distance error matrix, γ is a correction coefficient, L_0 is the global reference error matrix determined by the ranging results of the plurality of laser radars, S_n is the matrix of the distances of all pixels in the image from the image center point, O_n is the matrix of the ratios of the circle of confusion at each point in the image to the circle of confusion at or near the image center point, k1 and k2 are pre-calibrated exponents, and ⨀ denotes the element-wise product.
8. The system of claim 7, wherein the ranging points of the laser radars arranged uniformly around the monocular camera all fall on a diagonal of the image, and the ranging points are distributed uniformly on the diagonal; or
the ranging points of the laser radars arranged uniformly around the monocular camera all fall on a center line of the image, and the ranging points are distributed uniformly on the center line.
CN202110763071.7A 2021-07-06 2021-07-06 Method and system for correcting monocular distance measurement by using single line laser radar Active CN113219475B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110763071.7A CN113219475B (en) 2021-07-06 2021-07-06 Method and system for correcting monocular distance measurement by using single line laser radar

Publications (2)

Publication Number Publication Date
CN113219475A CN113219475A (en) 2021-08-06
CN113219475B true CN113219475B (en) 2021-10-22

Family

ID=77081050


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106780588A (en) * 2016-12-09 2017-05-31 浙江大学 A kind of image depth estimation method based on sparse laser observations
CN112070659A (en) * 2020-09-08 2020-12-11 苏州易航远智智能科技有限公司 Method for 3D information correction by using deep convolutional neural network
CN112241978A (en) * 2020-10-21 2021-01-19 广州小鹏自动驾驶科技有限公司 Data processing method and device
CN112484746A (en) * 2020-11-26 2021-03-12 上海电力大学 Monocular vision-assisted laser radar odometer method based on ground plane

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US10671082B2 (en) * 2017-07-03 2020-06-02 Baidu Usa Llc High resolution 3D point clouds generation based on CNN and CRF models




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant