CN111380503A - Monocular camera ranging method adopting laser-assisted calibration - Google Patents

Monocular camera ranging method adopting laser-assisted calibration

Info

Publication number
CN111380503A
CN111380503A
Authority
CN
China
Prior art keywords
calibration
aerial view
target
mapping
rectangle
Prior art date
Legal status
Granted
Application number
CN202010471623.2A
Other languages
Chinese (zh)
Other versions
CN111380503B (en)
Inventor
周书田
Current Assignee
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China
Priority to CN202010471623.2A
Publication of CN111380503A
Application granted
Publication of CN111380503B
Legal status: Active

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01C — MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 3/00 — Measuring distances in line of sight; Optical rangefinders
    • G01C 3/02 — Details

Landscapes

  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a monocular camera distance measurement method adopting laser-assisted calibration, and relates to the field of distance measurement calculation. The invention uses laser projection to mark a rectangle of known side length on the horizontal plane in front; a monocular camera then captures an image containing the calibration rectangle and the target to be measured; the captured image is projected into a bird's-eye view according to the known calibration rectangle, and finally the distance from the target to be measured to the reference position is calculated from the actual side length of the calibration rectangle. The camera-modeling-free, laser-assisted monocular ranging method provided by the invention dispenses with the complex process of modeling the monocular sensor, converting the need for camera modeling into the perspective projection relation between the ground plane and the camera's bird's-eye-view plane. It greatly reduces ranging error, simplifies the monocular ranging process, and yields effective, accurate and stable monocular ranging results.

Description

Monocular camera ranging method adopting laser-assisted calibration
Technical Field
The invention relates to the field of distance measurement calculation, in particular to the field of distance measurement calculation of a monocular camera.
Background
Applications and scenarios that depend on distance measurement are numerous, for example: a road monitoring camera judges the distance between pedestrians and moving vehicles on the street; a driving camera judges the distance between the ego vehicle and the vehicle ahead so as to automatically keep a safe following distance; a mobile phone front camera judges the distance between the photographed subject and the camera so as to automatically blur the background; a VR headset uses a monocular camera to map the surrounding physical environment.
Distance measurement and calibration are conventionally realized with a binocular camera, which can accurately and efficiently obtain the distance between each object in the image and the camera; however, binocular ranging suffers from high cost and complex computation. Monocular cameras are widely used in security, industrial production lines and mobile phones. A monocular camera costs less than half as much as a binocular one and, when deployed (particularly in industry), captures essentially the same image content. With the great improvement of camera manufacturing, the stability of monocular cameras is effectively guaranteed: their probability of fault-free continuous operation approaches that of binocular cameras, and the problem of redundant input images is avoided. Accurate ranging with a monocular camera therefore has broad application prospects.
Prior-art monocular ranging schemes have various defects. Under ideal experimental conditions, monocular distance measurement can be accomplished to a certain extent, but high-precision, high-accuracy monocular ranging still performs poorly outdoors. Recent monocular ranging algorithms model the camera and the real world in a very complex way: many schemes approximate the camera as a pinhole model and derive the transformation matrix from real-world coordinates to image coordinates from pinhole imaging and the similar triangles relating the image and world coordinate systems. The pinhole model suffices for low-precision scenarios, but it cannot meet the requirements of high-precision monocular ranging; the camera must then be fully or semi-fully modeled, a complex process that places strict demands on every experimental condition. If these demands are not met exactly, the ranging result carries large errors and fails the high-precision requirements of the applications above.
Disclosure of Invention
The invention aims to: aiming at the problems in the background technology, a monocular camera laser-assisted calibration distance measurement method without camera modeling is provided, so that stable and accurate monocular distance measurement is realized.
The technical scheme adopted by the invention is as follows: a monocular camera ranging method adopting laser-assisted calibration is characterized by comprising the following steps:
step 1: calibrating a rectangle with known side length on a front plane by adopting a laser projection mode to obtain a calibration rectangle;
step 2: acquiring an image containing the calibration rectangle and the target to be measured by using a monocular camera to obtain an acquired image, wherein the calibration rectangle in the acquired image is displayed as a trapezoid;
and step 3: mapping the acquired image obtained in the step 2 into a bird's-eye view, wherein the length-width ratio of the calibration rectangle in the bird's-eye view is the same as the length-width ratio of the calibration rectangle in the step 1; calculating the actual distance represented by each pixel point according to the number of the pixel points occupied by the perimeter of the calibration rectangle in the aerial view and the actual length of the perimeter;
and 4, step 4: selecting different calculation methods according to the property of the target to be measured and the ranging environment, wherein the calculation methods comprise the following steps:
a. if the target to be measured is an industrial production line or a static distance measurement environment, the reference position is the bottom side of the calibration rectangle in the aerial view and the extension line of the bottom side; calculating the number of pixels from the target to be measured to the reference position in the image, and calculating the actual distance from the target to be measured to the reference position according to the actual distance represented by each pixel point;
b. if the monocular camera is positioned on the vehicle and the target to be measured is a small car, the reference position is the middle point of the bottom edge of the calibration rectangle in the aerial view; calculating the number of pixels from the target to be measured to the reference position in the image, and calculating the actual distance from the target to be measured to the reference position according to the actual distance represented by each pixel point;
c. if the monocular camera is positioned on the vehicle and the target to be measured is a large transport vehicle or a large passenger car, the reference position is the middle point of the calibration rectangle in the aerial view; calculating an ellipse where the target to be measured is located, wherein the ellipse takes the reference position as the center of the ellipse, the bottom edge of the calibration rectangle in the aerial view is the edge where the long axis of the ellipse is located, and the long axis of the ellipse is 2 times the short axis; and determining half of the short axis of the ellipse as the distance from the target to be measured to the reference position, and calculating the actual distance from the target to be measured to the reference position according to the number of pixel points occupied by the half of the short axis and the actual distance represented by each pixel point.
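Once the metres-per-pixel scale of the bird's-eye view is known (step 3), the three cases above reduce to simple pixel geometry. The sketch below is our interpretation, not the patent's implementation: case a assumes the reference bottom edge lies on a horizontal image row, and case c reads "half of the short axis" as the semi-minor axis of the ellipse through the target; all names are illustrative.

```python
import math

def dist_a(target, baseline_y, m_per_px):
    """Case a (static scene): vertical pixel distance from the target to
    the row containing the calibration rectangle's bottom edge and its
    extension line, scaled to metres."""
    return abs(target[1] - baseline_y) * m_per_px

def dist_b(target, ref, m_per_px):
    """Case b (small car): pixel distance to the midpoint of the
    rectangle's bottom edge, scaled to metres."""
    return math.hypot(target[0] - ref[0], target[1] - ref[1]) * m_per_px

def dist_c(target, center, m_per_px):
    """Case c (large vehicle): the ellipse through the target is centred
    on the rectangle's midpoint, with its major axis along the bottom
    edge and major = 2 * minor; the semi-minor axis is the distance."""
    dx, dy = target[0] - center[0], target[1] - center[1]
    # From (dx / 2b)^2 + (dy / b)^2 = 1  =>  b = sqrt((dx/2)^2 + dy^2)
    b_px = math.sqrt((dx / 2.0) ** 2 + dy ** 2)
    return b_px * m_per_px
```

For a target straight ahead (dx = 0), case c collapses to the same scaled pixel distance as case b, which is consistent with the ellipse degenerating along its minor axis.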
Further, the calibration rectangle in the step 1 is a square.
Further, the specific method of step 3 is as follows:
step 3.1: establishing a blank graph of the aerial view, directly designating four vertexes of a calibration rectangle in the aerial view in the blank graph, wherein the length-width ratio of the rectangle formed by the designated four vertexes is the same as the length-width ratio of the calibration rectangle in the step 1;
step 3.2: sequentially and correspondingly mapping the four vertexes of the calibration rectangle in the image collected in the step 2 to the four vertexes specified in the blank image by adopting the following formula;
$$\begin{bmatrix} x' \\ y' \\ w' \end{bmatrix} = M \begin{bmatrix} u \\ v \\ 1 \end{bmatrix}, \qquad x = \frac{x'}{w'}, \quad y = \frac{y'}{w'}$$
wherein $(u, v)$ represents the coordinates of a pixel point in the acquired image before mapping, and $(x, y)$ are the coordinates of the pixel point in the mapped aerial view; $M$ is the 3×3 mapping matrix, and the numerical values of its elements are obtained from the mapping of the four vertexes of the calibration rectangle in the collected image to the four vertexes of the calibration rectangle in the aerial view;
step 3.3: mapping the other pixel points in the collected image into the aerial view by using the mapping matrix calculated from the four vertexes in the step 3.2;
step 3.4: and calculating the actual distance represented by each pixel point according to the number of the pixel points occupied by the perimeter of the calibration rectangle in the aerial view and the actual perimeter length.
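The vertex-correspondence mapping of steps 3.2–3.3 can be sketched with a pure-NumPy homography solve (a minimal illustration under the formula above, not the patent's code; the pixel coordinates are hypothetical, and in practice OpenCV's cv2.getPerspectiveTransform and cv2.warpPerspective perform the same computation):

```python
import numpy as np

def homography_from_quads(src, dst):
    """Solve for the 3x3 mapping matrix M (normalised so M[2,2] = 1)
    from four point correspondences src[i] -> dst[i]."""
    A, b = [], []
    for (u, v), (x, y) in zip(src, dst):
        # x = (m11 u + m12 v + m13) / (m31 u + m32 v + 1), similarly y,
        # cross-multiplied into two linear equations per correspondence.
        A.append([u, v, 1, 0, 0, 0, -u * x, -v * x]); b.append(x)
        A.append([0, 0, 0, u, v, 1, -u * y, -v * y]); b.append(y)
    h = np.linalg.solve(np.asarray(A, float), np.asarray(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def map_point(M, u, v):
    """Apply M to pixel (u, v): homogeneous multiply, divide by w'."""
    x, y, w = M @ np.array([u, v, 1.0])
    return x / w, y / w

# Trapezoid seen in the image -> rectangle in the bird's-eye view
# (hypothetical coordinates; vertex order P1 top-left .. P4 bottom-left).
src = [(150, 200), (250, 200), (300, 300), (100, 300)]
dst = [(100, 100), (300, 100), (300, 300), (100, 300)]
M = homography_from_quads(src, dst)
```

Applying `map_point(M, u, v)` to every pixel of the acquired image, rather than only to the four vertices, yields the full aerial view of step 3.3.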
Further, the specific method of step 3 is as follows:
step 3.1: establishing a blank graph of the aerial view, and directly copying a left lower vertex and a right lower vertex of a calibration rectangle in the acquired image into the aerial view without mapping transformation;
step 3.2: determining the positions of the left upper vertex and the right upper vertex of the calibration rectangle in the aerial view according to the length-width ratio of the calibration rectangle in the step 1; then, mapping is carried out by adopting the following formula to obtain a mapping matrix:
$$\begin{bmatrix} x' \\ y' \\ w' \end{bmatrix} = M \begin{bmatrix} u \\ v \\ 1 \end{bmatrix}, \qquad x = \frac{x'}{w'}, \quad y = \frac{y'}{w'}$$
wherein $(u, v)$ represents the coordinates of a pixel point in the acquired image before mapping, and $(x, y)$ are the coordinates of the pixel point in the mapped aerial view; $M$ is the 3×3 mapping matrix, and the numerical values of its elements are obtained from the mapping of the four vertexes of the calibration rectangle in the collected image to the four vertexes of the calibration rectangle in the aerial view;
step 3.3: mapping the other pixel points in the collected image into the aerial view by using the mapping matrix calculated from the four vertexes in the step 3.1 and the step 3.2;
step 3.4: and calculating the actual distance represented by each pixel point according to the number of the pixel points occupied by the perimeter of the calibration rectangle in the aerial view and the actual length of the perimeter.
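In this variant only the two top vertices of the bird's-eye rectangle need to be chosen, since the bottom two are copied unchanged. A small sketch of how their positions might be derived from the fixed bottom vertices and the known aspect ratio (our illustration; image y is assumed to grow downward):

```python
def top_vertices(p4, p3, aspect):
    """p4 = bottom-left and p3 = bottom-right vertex, copied unchanged
    from the acquired image. aspect = vertical side / bottom side of the
    real calibration rectangle (e.g. 1.5 for a 1 m x 1.5 m rectangle)."""
    width = p3[0] - p4[0]      # bottom edge length in pixels
    h = aspect * width         # rectangle height in pixels
    p1 = (p4[0], p4[1] - h)    # top-left vertex in the bird's-eye view
    p2 = (p3[0], p3[1] - h)    # top-right vertex
    return p1, p2
```

With these four bird's-eye vertices fixed, the mapping matrix is solved exactly as in the first variant of step 3.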
Further, the specific method of step 3 is as follows:
step 3.1: identifying a target to be measured in the collected image, and determining pixel points occupied by the target to be measured and the lower bottom edge of the target to be measured;
step 3.2: establishing a blank graph of the aerial view, and directly copying a left lower vertex and a right lower vertex of a calibration rectangle in the acquired image into the aerial view without mapping transformation;
step 3.3: determining the positions of the upper left vertex and the upper right vertex of the calibration rectangle in the aerial view according to the length-width ratio of the calibration rectangle in the step 1; mapping is carried out by adopting the following formula to obtain a mapping matrix:
$$\begin{bmatrix} x' \\ y' \\ w' \end{bmatrix} = M \begin{bmatrix} u \\ v \\ 1 \end{bmatrix}, \qquad x = \frac{x'}{w'}, \quad y = \frac{y'}{w'}$$
wherein $(u, v)$ represents the coordinates of a pixel point in the acquired image before mapping, and $(x, y)$ are the coordinates of the pixel point in the mapped aerial view; $M$ is the 3×3 mapping matrix, and the numerical values of its elements are obtained from the mapping of the four vertexes of the calibration rectangle in the collected image to the four vertexes of the calibration rectangle in the aerial view;
step 3.4: mapping the pixel points in the collected image, other than the pixel points occupied by the target to be measured, into the aerial view by using the mapping matrix calculated from the four vertexes in the step 3.2 and the step 3.3, so as to obtain an aerial view that is blank at the target to be measured;
step 3.5: directly copying the target to be measured determined in the step 3.1, without mapping transformation, into the blank area at the target to be measured in the aerial view, so that the middle point of the lower bottom edge of the target to be measured determined in the step 3.1 coincides with the middle point of the lower bottom edge of the blank area in the aerial view;
step 3.6: and calculating the actual distance represented by each pixel point according to the number of the pixel points occupied by the perimeter of the calibration rectangle in the aerial view and the actual perimeter length.
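Step 3.5 pastes the detected target into the bird's-eye view without warping it, aligned at the bottom-edge midpoint. A minimal NumPy sketch under the simplifying assumption that the target region is an axis-aligned rectangular patch (array names are ours):

```python
import numpy as np

def paste_target(bev, patch, bottom_mid):
    """Copy the un-warped target patch into the bird's-eye view so the
    midpoint of the patch's bottom edge lands on bottom_mid = (x, y),
    the midpoint of the blank area's bottom edge."""
    h, w = patch.shape[:2]
    x0 = int(round(bottom_mid[0] - w / 2))  # left column of the patch
    y0 = int(round(bottom_mid[1] - h + 1))  # top row: bottom row == y
    bev[y0:y0 + h, x0:x0 + w] = patch
    return bev

# Tiny demonstration: a 2x4 white patch pasted into a 10x10 black view
# with its bottom-edge midpoint at column 5, row 8.
bev = paste_target(np.zeros((10, 10), dtype=np.uint8),
                   np.full((2, 4), 255, dtype=np.uint8), (5, 8))
```

Because the patch is copied verbatim rather than warped, the target keeps its original appearance while its ground-contact point sits at the geometrically correct bird's-eye location.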
Further, the distance from the target to be measured to the reference position in the step 4 refers to the distance from the middle point of the lower bottom edge of the target to be measured to the reference position.
The invention has the beneficial effects that:
1. the monocular camera ranging scheme of the invention adopts laser-assisted calibration to replace camera modeling, reduces the dependence on experimental environment, and can effectively obtain accurate and stable monocular camera ranging results under different areas, road conditions and weather conditions.
2. The monocular laser auxiliary calibration distance measurement scheme without camera modeling provided by the invention utilizes a laser auxiliary calibration process, is simple to realize and has an obvious effect on improving the calibration accuracy.
3. The monocular laser assisted calibration distance measurement scheme free of camera modeling provided by the invention gets rid of the complex process of modeling a monocular RGB (red, green and blue) sensor, skillfully converts the necessity of camera modeling into the perspective projection relation between a land horizontal plane and a camera aerial view plane, greatly reduces the distance measurement error caused by inaccurate camera modeling, and simplifies the monocular distance measurement process.
Drawings
FIG. 1 is a flow chart of a monocular camera ranging method using laser-assisted calibration according to the present invention.
Fig. 2 is a bird's-eye view after mapping transformation according to a first embodiment of the present invention.
FIG. 3 is an effect diagram of straight equidistant lines drawn in the bird's-eye view according to the first embodiment of the present invention.
Fig. 4 is a schematic diagram of a projection calibration rectangle according to a second embodiment of the present invention.
Fig. 5 is an effect diagram of drawing circular equidistant lines in the captured image according to the second embodiment of the present invention.
Fig. 6 is an effect diagram of drawing an elliptical equidistant line in a captured image according to a third embodiment of the present invention.
Detailed Description
The flow chart of the technical scheme of the invention is shown in figure 1, and comprises the following steps:
step 1: calibrating a rectangle with known side length on a front plane by adopting a laser projection mode to obtain a calibration rectangle;
step 2: acquiring an image containing the calibration rectangle and the target to be measured by using a monocular camera to obtain an acquired image, wherein the calibration rectangle in the acquired image is displayed as a trapezoid;
and step 3: mapping the acquired image obtained in the step 2 into a bird's-eye view, wherein the length-width ratio of the calibration rectangle in the bird's-eye view is the same as the length-width ratio of the calibration rectangle in the step 1; calculating the actual distance represented by each pixel point according to the number of the pixel points occupied by the perimeter of the calibration rectangle in the aerial view and the actual length of the perimeter;
and 4, step 4: and selecting different calculation methods according to the property of the target to be measured and the ranging environment, and calculating the actual distance.
Example one
A monocular camera ranging method adopting laser-assisted calibration is characterized by comprising the following steps:
step 1: marking a square with the side length of 5cm on a notebook computer keyboard by adopting a laser projection mode;
step 2: acquiring an image containing a calibration square and a notebook computer keyboard by adopting a monocular camera to obtain an acquired image, wherein the calibration square in the image is displayed as a trapezoid;
and step 3: mapping the image acquired in the step 2 into a bird's-eye view, as shown in FIG. 2, and calculating the actual distance represented by each pixel point according to the number of the pixel points occupied by the perimeter of the calibration square in the bird's-eye view and the actual length of the perimeter;
step 3.1: establishing a blank graph of the aerial view, and directly appointing four vertexes of a marked square in the aerial view in the blank graph;
step 3.2: sequentially and correspondingly mapping the four vertexes of the marked square in the image collected in the step 2 to the four vertexes of the marked square in the aerial view by adopting the following formula;
$$\begin{bmatrix} x' \\ y' \\ w' \end{bmatrix} = M \begin{bmatrix} u \\ v \\ 1 \end{bmatrix}, \qquad x = \frac{x'}{w'}, \quad y = \frac{y'}{w'}$$
wherein $(u, v)$ represents the coordinates of a pixel point in the acquired image before mapping, and $(x, y)$ are the coordinates of the pixel point in the mapped aerial view; $M$ is the 3×3 mapping matrix, and the numerical values of its elements are obtained from the mapping of the four vertexes of the calibration square in the collected image to the four vertexes of the calibration square in the aerial view;
step 3.3: mapping the other pixel points in the collected image into the aerial view by using the mapping matrix calculated from the four vertexes in the step 3.2;
step 3.4: calculating the actual length represented by each pixel point according to the number of the pixel points occupied by the perimeter of the marked square in the aerial view and the actual length of the perimeter;
and 4, step 4: at this point, the ranging reference position is the extension line of the bottom edge of the calibration square in the aerial view; the number of pixels from the target to be measured to the reference position in the image is counted, and the distance from the target to be measured to the reference position is then calculated from the actual distance represented by each pixel point. As shown in FIG. 3, by drawing straight equidistant lines with 5 cm as the step length, the distance from any point on the notebook computer to the reference position can be calculated; the width of the notebook computer keyboard is calculated to be 30 cm.
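The scale recovery behind steps 3.4 and 4 is a simple ratio. A tiny worked sketch using hypothetical pixel counts chosen to be consistent with this example's 5 cm square and 30 cm keyboard width:

```python
def cm_per_pixel(perimeter_px, side_cm=5.0):
    """Actual length represented by one bird's-eye-view pixel, from the
    number of pixels the calibration square's perimeter occupies."""
    return 4 * side_cm / perimeter_px

scale = cm_per_pixel(perimeter_px=400)  # hypothetical: perimeter spans 400 px
width_cm = 600 * scale                  # hypothetical: keyboard spans 600 px
```

With a 400-pixel perimeter each pixel represents 0.05 cm, so a 600-pixel keyboard span corresponds to the 30 cm stated in the embodiment.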
Example two
The monocular camera is positioned on a vehicle, and the target to be measured is a small-sized family car, and the method comprises the following steps:
step 1: projecting two laser beams perpendicular to each other on a road surface in front in a laser projection mode, wherein the projection method is as shown in fig. 4, and a calibration rectangle with the bottom edge of 1m and the vertical edge of 1.5m is calibrated by taking the two laser beams as reference in the case that the vertical edge is parallel to the extending direction of the road surface;
step 2: acquiring an image containing a calibration rectangle and a ranging target by using a monocular camera to obtain an acquired image, wherein the calibration rectangle in the acquired image is displayed in a trapezoid shape, and four vertexes P1, P2, P3 and P4 of the calibration rectangle in the acquired image are determined;
and step 3: mapping the image acquired in the step 2 into a bird's-eye view, wherein the length-width ratio of the calibration rectangle in the bird's-eye view is the same as that of the calibration rectangle in the step 1; calculating the actual distance represented by each pixel point according to the number of the pixel points occupied by the perimeter of the calibration rectangle in the aerial view and the actual length of the perimeter;
step 3.1: establishing a blank graph of the aerial view, and directly copying the lower-right vertex P3 and the lower-left vertex P4 of the calibration rectangle into the aerial view without mapping transformation;
step 3.2: determining the positions of the upper left vertex P1 and the upper right vertex P2 of the calibration rectangle in the bird's eye view according to the aspect ratio of the calibration rectangle in the step 1; mapping is carried out by adopting the following formula to obtain a mapping matrix:
$$\begin{bmatrix} x' \\ y' \\ w' \end{bmatrix} = M \begin{bmatrix} u \\ v \\ 1 \end{bmatrix}, \qquad x = \frac{x'}{w'}, \quad y = \frac{y'}{w'}$$
wherein $(u, v)$ represents the coordinates of a pixel point in the acquired image before mapping, and $(x, y)$ are the coordinates of the pixel point in the mapped aerial view; $M$ is the 3×3 mapping matrix, and the numerical values of its elements are obtained from the mapping of the four vertexes of the calibration rectangle in the collected image to the four vertexes of the calibration rectangle in the aerial view;
step 3.3: mapping the other pixel points in the collected image into the aerial view by using the mapping matrix calculated from the four vertexes of the calibration rectangle in the step 3.1 and the step 3.2;
step 3.4: and calculating the actual length represented by each pixel point according to the number of the pixel points occupied by the perimeter of the calibration rectangle in the aerial view and the actual length of the perimeter.
And 4, step 4: the ranging reference position is the middle point of the bottom edge of the calibration rectangle in the aerial view; calculating the number of pixels from the target to be measured to the reference position in the image, and calculating the distance from the target to be measured to the reference position according to the actual distance represented by each pixel point; as shown in fig. 5, the present embodiment draws circular equidistant lines in the captured image in steps of 2.5 m.
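The circular equidistant lines of FIG. 5 only require converting each 2.5 m step into a pixel radius around the reference midpoint. A minimal sketch (the 0.05 m-per-pixel scale is a hypothetical value, not taken from the patent):

```python
def circle_radii_px(step_m, count, m_per_px):
    """Pixel radii of the equidistant circles at distances k * step_m
    (k = 1 .. count) around the bottom-edge midpoint."""
    return [round(k * step_m / m_per_px) for k in range(1, count + 1)]

radii = circle_radii_px(2.5, 4, m_per_px=0.05)
```

Each radius can then be drawn as a circle centred on the reference point, giving the 2.5 m, 5 m, 7.5 m, 10 m rings of the figure.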
EXAMPLE III
The monocular camera is positioned on the vehicle, and the target to be measured is a large transport vehicle; the method comprises the following steps:
step 1: marking a square with the side length of 4m on the road surface in front by adopting a laser projection mode;
step 2: acquiring an image containing a calibration square and a target with a distance measurement by adopting a monocular camera to obtain an acquired image, wherein the calibration square in the acquired image is displayed as a trapezoid;
and step 3: mapping the image acquired in the step (2) into a bird's-eye view, and calculating the actual distance represented by each pixel point according to the number of the pixel points occupied by the perimeter of the marked square in the bird's-eye view and the actual length of the perimeter;
step 3.1: identifying a target to be measured in the collected image, and determining pixel points occupied by the target to be measured and the lower bottom edge of the target to be measured;
step 3.2: establishing a blank graph of the aerial view, and directly copying the right lower vertex and the left lower vertex of the calibration square in the collected image into the aerial view;
step 3.3: determining the positions of the upper left vertex and the upper right vertex of the marked square in the aerial view according to the equal side length of the square; mapping is carried out by adopting the following formula to obtain a mapping matrix:
$$\begin{bmatrix} x' \\ y' \\ w' \end{bmatrix} = M \begin{bmatrix} u \\ v \\ 1 \end{bmatrix}, \qquad x = \frac{x'}{w'}, \quad y = \frac{y'}{w'}$$
wherein $(u, v)$ represents the coordinates of a pixel point in the acquired image before mapping, and $(x, y)$ are the coordinates of the pixel point in the mapped aerial view; $M$ is the 3×3 mapping matrix, and the numerical values of its elements are obtained from the mapping of the four vertexes of the calibration square in the collected image to the four vertexes of the calibration square in the aerial view;
step 3.4: mapping the pixel points in the collected image, other than the pixel points occupied by the target to be measured, into the aerial view by using the mapping matrix calculated from the four vertexes of the calibration square in the step 3.2 and the step 3.3, so as to obtain an aerial view that is blank at the target to be measured;
step 3.5: directly copying the target to be measured determined in the step 3.1, without mapping transformation, into the blank area at the target to be measured in the aerial view, so that the middle point of the lower bottom edge of the target to be measured determined in the step 3.1 coincides with the middle point of the lower bottom edge of the blank area in the aerial view;
step 3.6: and calculating the actual length represented by each pixel point according to the number of the pixel points occupied by the perimeter of the marked square in the aerial view and the actual length of the perimeter.
And 4, step 4: the reference position is the midpoint of the marked square in the aerial view; calculating an ellipse where a target to be measured is located, wherein the ellipse takes a reference position as an ellipse center, the bottom edge of a marked square in the aerial view is the edge where the long axis of the ellipse is located, and the long axis of the ellipse is 2 times of the short axis; determining half of the short axis of the ellipse as the distance from the target to be measured to the reference position, and calculating the actual distance from the target to be measured to the reference position according to the number of pixel points occupied by the half of the short axis and the actual distance represented by each pixel point; as shown in fig. 6, the present embodiment draws elliptical equidistant lines in the captured image in steps of 5 meters.
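The elliptical equidistant lines of FIG. 6 follow directly from the case-c geometry: the line at distance d is the ellipse with semi-minor axis b = d and semi-major axis a = 2d, centred on the square's midpoint. A sketch of the axis computation (the 0.1 m-per-pixel scale is hypothetical):

```python
def ellipse_axes_px(step_m, count, m_per_px):
    """(semi-major, semi-minor) pixel axes of the equidistant ellipse at
    distance k * step_m, with major axis = 2 * minor axis."""
    axes = []
    for k in range(1, count + 1):
        b = k * step_m / m_per_px  # semi-minor axis = distance in pixels
        axes.append((2 * b, b))
    return axes

# approx. [(100, 50), (200, 100), (300, 150)] for 5 m steps at 0.1 m/px
axes = ellipse_axes_px(5.0, 3, m_per_px=0.1)
```

Drawing one such ellipse per step, major axis along the square's bottom edge, reproduces the 5 m, 10 m, 15 m lines of the figure.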

Claims (6)

1. A monocular camera ranging method adopting laser-assisted calibration is characterized by comprising the following steps:
step 1: calibrating a rectangle with known side length on a front plane by adopting a laser projection mode to obtain a calibration rectangle;
step 2: acquiring an image containing the calibration rectangle and the target to be measured by using a monocular camera to obtain an acquired image, wherein the calibration rectangle in the acquired image is displayed as a trapezoid;
and step 3: mapping the acquired image obtained in the step 2 into a bird's-eye view, wherein the length-width ratio of the calibration rectangle in the bird's-eye view is the same as the length-width ratio of the calibration rectangle in the step 1; calculating the actual distance represented by each pixel point according to the number of the pixel points occupied by the perimeter of the calibration rectangle in the aerial view and the actual length of the perimeter;
and 4, step 4: selecting different calculation methods according to the property of the target to be measured and the ranging environment, wherein the calculation methods comprise the following steps:
a. if the target to be measured is an industrial production line or a static distance measurement environment, the reference position is the bottom side of the calibration rectangle in the aerial view and the extension line of the bottom side; calculating the number of pixels from the target to be measured to the reference position in the image, and calculating the actual distance from the target to be measured to the reference position according to the actual distance represented by each pixel point;
b. if the monocular camera is positioned on the vehicle and the target to be measured is a small car, the reference position is the middle point of the bottom edge of the calibration rectangle in the aerial view; calculating the number of pixels from the target to be measured to the reference position in the image, and calculating the actual distance from the target to be measured to the reference position according to the actual distance represented by each pixel point;
c. if the monocular camera is positioned on the vehicle and the target to be measured is a large transport vehicle or a large passenger car, the reference position is the middle point of the calibration rectangle in the aerial view; calculating an ellipse where a target to be measured is located, wherein the ellipse takes the reference position as the center of the ellipse, the bottom edge of the standard rectangle in the aerial view is the edge where the long axis of the ellipse is located, and the long axis of the ellipse is 2 times of the short axis; and determining half of the short axis of the ellipse as the distance from the target to be measured to the reference position, and calculating the actual distance from the target to be measured to the reference position according to the number of pixel points occupied by the half of the short axis and the actual distance represented by each pixel point.
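The distance rules of step 3 and cases (a) through (c) reduce to simple arithmetic once the per-pixel scale is known. A minimal sketch in Python; all function names and numeric values below are illustrative, not taken from the patent:

```python
def scale_from_perimeter(perimeter_pixels: float, perimeter_meters: float) -> float:
    """Meters represented by one pixel, from the calibration rectangle's
    perimeter as seen in the aerial view (step 3)."""
    return perimeter_meters / perimeter_pixels

def distance_to_reference(pixel_distance: float, m_per_px: float) -> float:
    """Cases (a) and (b): pixel count to the reference position times scale."""
    return pixel_distance * m_per_px

def distance_large_vehicle(semi_minor_pixels: float, m_per_px: float) -> float:
    """Case (c): half of the ellipse's short axis, in pixels, times scale
    (the claim fixes the long axis at twice the short axis)."""
    return semi_minor_pixels * m_per_px

# e.g. a 2 m x 2 m laser square whose perimeter covers 800 px in the aerial view
m_per_px = scale_from_perimeter(800, 8.0)  # 0.01 m per pixel
```

Under these made-up numbers, a target 350 px from the reference line would be 3.5 m away.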
2. The monocular camera ranging method using laser-assisted calibration according to claim 1, wherein the calibration rectangle in step 1 is a square.
3. The monocular camera ranging method with laser-assisted calibration according to claim 1, wherein the specific method in step 3 is as follows:
step 3.1: establishing a blank image for the aerial view, and directly designating in the blank image the four vertexes of the calibration rectangle in the aerial view, wherein the length-width ratio of the rectangle formed by the designated four vertexes is the same as that of the calibration rectangle in step 1;
step 3.2: mapping the four vertexes of the calibration rectangle in the image acquired in step 2, in order, onto the four vertexes designated in the blank image by adopting the following formula:
[x', y', w']ᵀ = M · [x, y, 1]ᵀ ,  (u, v) = (x'/w', y'/w')

wherein (x, y) represents the coordinates of a pixel point in the acquired image before mapping, (u, v) represents the coordinates of that pixel point in the mapped aerial view, and M is the 3 × 3 mapping matrix whose element values are solved from the mapping of the four vertexes of the calibration rectangle in the acquired image onto the four vertexes of the calibration rectangle in the aerial view;
step 3.3: applying the mapping matrix solved from the four vertexes in step 3.2 to the other pixel points in the acquired image, thereby mapping them into the aerial view;
step 3.4: and calculating the actual distance represented by each pixel point according to the number of the pixel points occupied by the perimeter of the calibration rectangle in the aerial view and the actual perimeter length.
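The four-point mapping of steps 3.1 through 3.3 is a standard perspective homography; in practice OpenCV's getPerspectiveTransform/warpPerspective would do the same job. A self-contained sketch using a direct linear solve, where the corner coordinates are made-up illustrative values:

```python
import numpy as np

def homography_from_points(src, dst):
    """Solve the 3x3 mapping matrix M sending each src (x, y) to dst (u, v),
    with M[2, 2] normalized to 1."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def map_point(M, x, y):
    """Apply M to one pixel and divide by the homogeneous coordinate."""
    u, v, w = M @ np.array([x, y, 1.0])
    return u / w, v / w

# Trapezoid corners of the laser rectangle in the camera image
# (bottom-left, bottom-right, top-right, top-left) ...
src = [(100, 400), (500, 400), (380, 200), (220, 200)]
# ... and the rectangle corners designated in the aerial-view blank image.
dst = [(100, 400), (500, 400), (500, 0), (100, 0)]
M = homography_from_points(src, dst)
```

Applying map_point(M, x, y) to every pixel of the acquired image realizes step 3.3.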
4. The monocular camera ranging method with laser-assisted calibration according to claim 1, wherein the specific method in step 3 is as follows:
step 3.1: establishing a blank image for the aerial view, and copying the lower-left and lower-right vertexes of the calibration rectangle in the acquired image directly into the aerial view without mapping transformation;
step 3.2: determining the positions of the upper-left and upper-right vertexes of the calibration rectangle in the aerial view according to the length-width ratio of the calibration rectangle in step 1; then carrying out the mapping with the following formula to obtain the mapping matrix:
[x', y', w']ᵀ = M · [x, y, 1]ᵀ ,  (u, v) = (x'/w', y'/w')

wherein (x, y) represents the coordinates of a pixel point in the acquired image before mapping, (u, v) represents the coordinates of that pixel point in the mapped aerial view, and M is the 3 × 3 mapping matrix whose element values are solved from the mapping of the four vertexes of the calibration rectangle in the acquired image onto the four vertexes of the calibration rectangle in the aerial view;
step 3.3: applying the mapping matrix solved from the four vertexes in steps 3.1 and 3.2 to the other pixel points in the acquired image, thereby mapping them into the aerial view;
step 3.4: and calculating the actual distance represented by each pixel point according to the number of the pixel points occupied by the perimeter of the calibration rectangle in the aerial view and the actual length of the perimeter.
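In this claim's variant, only the two bottom vertexes are copied unchanged, and the two top vertexes are then placed from the physical aspect ratio. A sketch of that vertex construction, assuming the copied bottom edge is horizontal and the image y axis grows downward (names and values are illustrative):

```python
def aerial_top_vertices(bottom_left, bottom_right, width_over_height):
    """Place the upper-left/upper-right vertexes of the aerial-view
    calibration rectangle so its aspect ratio matches the physical one
    (step 3.2); the bottom vertexes are copied unchanged (step 3.1)."""
    width_px = bottom_right[0] - bottom_left[0]
    height_px = width_px / width_over_height
    # image y axis points downward, so "up" means smaller y
    upper_left = (bottom_left[0], bottom_left[1] - height_px)
    upper_right = (bottom_right[0], bottom_right[1] - height_px)
    return upper_left, upper_right
```

For a 2:1 laser rectangle with bottom vertexes at (100, 400) and (500, 400), this places the top vertexes at (100, 200) and (500, 200).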
5. The monocular camera ranging method with laser-assisted calibration according to claim 1, wherein the specific method in step 3 is as follows:
step 3.1: identifying the target to be measured in the acquired image, and determining the pixel points occupied by the target to be measured and by its lower bottom edge;
step 3.2: establishing a blank image for the aerial view, and copying the lower-left and lower-right vertexes of the calibration rectangle in the acquired image directly into the aerial view without mapping transformation;
step 3.3: determining the positions of the upper-left and upper-right vertexes of the calibration rectangle in the aerial view according to the length-width ratio of the calibration rectangle in step 1; carrying out the mapping with the following formula to obtain the mapping matrix:
[x', y', w']ᵀ = M · [x, y, 1]ᵀ ,  (u, v) = (x'/w', y'/w')

wherein (x, y) represents the coordinates of a pixel point in the acquired image before mapping, (u, v) represents the coordinates of that pixel point in the mapped aerial view, and M is the 3 × 3 mapping matrix whose element values are solved from the mapping of the four vertexes of the calibration rectangle in the acquired image onto the four vertexes of the calibration rectangle in the aerial view;
step 3.4: applying the mapping matrix solved from the four vertexes in steps 3.2 and 3.3 to the pixel points in the acquired image other than those occupied by the target to be measured, thereby mapping them into the aerial view and leaving a blank area at the location of the target to be measured;
step 3.5: copying the target to be measured determined in step 3.1 directly into the blank area of the aerial view without mapping transformation, so that the midpoint of the lower bottom edge of the target determined in step 3.1 coincides with the midpoint of the lower bottom edge of the blank area in the aerial view;
step 3.6: and calculating the actual distance represented by each pixel point according to the number of the pixel points occupied by the perimeter of the calibration rectangle in the aerial view and the actual perimeter length.
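The copy-without-transformation of step 3.5 amounts to translating the detected target so the two bottom-edge midpoints coincide. A sketch, with illustrative names and coordinates:

```python
def paste_offset(target_bottom_mid, blank_bottom_mid):
    """Translation (dx, dy) to add to every pixel of the target so that the
    midpoint of its lower bottom edge (step 3.1) coincides with the midpoint
    of the lower bottom edge of the blank area in the aerial view."""
    dx = blank_bottom_mid[0] - target_bottom_mid[0]
    dy = blank_bottom_mid[1] - target_bottom_mid[1]
    return dx, dy
```

For example, a target whose bottom-edge midpoint sits at (320, 410) pastes into a blank area whose bottom-edge midpoint is (300, 400) with offset (-20, -10).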
6. The monocular camera ranging method according to claim 1 or 5, wherein the distance from the target to be measured to the reference position in step 4 is the distance from the midpoint of the lower bottom edge of the target to be measured to the reference position.
CN202010471623.2A 2020-05-29 2020-05-29 Monocular camera ranging method adopting laser-assisted calibration Active CN111380503B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010471623.2A CN111380503B (en) 2020-05-29 2020-05-29 Monocular camera ranging method adopting laser-assisted calibration

Publications (2)

Publication Number Publication Date
CN111380503A true CN111380503A (en) 2020-07-07
CN111380503B CN111380503B (en) 2020-09-25

Family

ID=71222947

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010471623.2A Active CN111380503B (en) 2020-05-29 2020-05-29 Monocular camera ranging method adopting laser-assisted calibration

Country Status (1)

Country Link
CN (1) CN111380503B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112733678A (en) * 2020-12-31 2021-04-30 深兰人工智能(深圳)有限公司 Ranging method, ranging device, computer equipment and storage medium
WO2023273108A1 (en) * 2021-06-30 2023-01-05 深圳市优必选科技股份有限公司 Monocular distance measurement method and apparatus, and intelligent apparatus

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012038485A1 (en) * 2010-09-22 2012-03-29 Henesis S.R.L. Pantograph monitoring system and method
US20130236063A1 (en) * 2012-03-07 2013-09-12 Xerox Corporation Multiple view transportation imaging systems
CN105856262A (en) * 2016-05-16 2016-08-17 清华大学 Method for detecting and recognizing object by small robot through touch sense
CN107389026A (en) * 2017-06-12 2017-11-24 江苏大学 A kind of monocular vision distance-finding method based on fixing point projective transformation
CN108596980A (en) * 2018-03-29 2018-09-28 中国人民解放军63920部队 Circular target vision positioning precision assessment method, device, storage medium and processing equipment
CN109343041A (en) * 2018-09-11 2019-02-15 昆山星际舟智能科技有限公司 The monocular distance measuring method driven for high-grade intelligent auxiliary
US20190166314A1 (en) * 2017-11-30 2019-05-30 International Business Machines Corporation Ortho-selfie distortion correction using multiple sources
CN109949355A (en) * 2019-03-14 2019-06-28 大连民族大学 The method of half fan-shaped equidistant line model is established in monocular vision pedestrian's distance estimations
CN110174088A (en) * 2019-04-30 2019-08-27 上海海事大学 A kind of target ranging method based on monocular vision
CN110399762A (en) * 2018-04-24 2019-11-01 北京四维图新科技股份有限公司 A kind of method and device of the lane detection based on monocular image
CN110672020A (en) * 2019-06-14 2020-01-10 浙江农林大学 Stand tree height measuring method based on monocular vision
US20200105017A1 (en) * 2018-09-30 2020-04-02 Boe Technology Group Co., Ltd. Calibration method and calibration device of vehicle-mounted camera, vehicle and storage medium
CN110991264A (en) * 2019-11-12 2020-04-10 浙江鸿泉车联网有限公司 Front vehicle detection method and device
CN111080707A (en) * 2019-10-17 2020-04-28 深圳亿智时代科技有限公司 Monocular panoramic parking calibration method
CN111192235A (en) * 2019-12-05 2020-05-22 中国地质大学(武汉) Image measuring method based on monocular vision model and perspective transformation


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
J. Han et al.: "Vehicle Distance Estimation Using a Mono-Camera for FCW/AEB Systems", International Journal of Automotive Technology *
Preaw Wongsaree et al.: "Distance Detection Technique Using Enhancing Inverse Perspective Mapping", 2018 3rd International Conference on Computer and Communication Systems *
Liu Bingbing: "Target Detection and Distance Measurement in Driving Conditions Based on Monocular Vision", China Masters' Theses Full-text Database, Information Science and Technology *
Chen Qi: "Research on a Real-time Vehicle Distance Measurement Method Based on Monocular Vision", China Masters' Theses Full-text Database, Information Science and Technology *


Also Published As

Publication number Publication date
CN111380503B (en) 2020-09-25

Similar Documents

Publication Publication Date Title
CN110148169B (en) Vehicle target three-dimensional information acquisition method based on PTZ (pan/tilt/zoom) pan-tilt camera
CN106570904B (en) A kind of multiple target relative pose recognition methods based on Xtion camera
CN110031829B (en) Target accurate distance measurement method based on monocular vision
CN110842940A (en) Building surveying robot multi-sensor fusion three-dimensional modeling method and system
CN103065323B (en) Subsection space aligning method based on homography transformational matrix
CN109767473B (en) Panoramic parking device calibration method and device
CN112037159B (en) Cross-camera road space fusion and vehicle target detection tracking method and system
US20210342620A1 (en) Geographic object detection apparatus and geographic object detection method
CN110307791B (en) Vehicle length and speed calculation method based on three-dimensional vehicle boundary frame
WO2015019526A1 (en) Image processing device and markers
CN103729837A (en) Rapid calibration method of single road condition video camera
CN111207762B (en) Map generation method and device, computer equipment and storage medium
CN111272139B (en) Monocular vision-based vehicle length measuring method
CN111380503B (en) Monocular camera ranging method adopting laser-assisted calibration
CN112648976B (en) Live-action image measuring method and device, electronic equipment and storage medium
CN113834492A (en) Map matching method, system, device and readable storage medium
CN114283391A (en) Automatic parking sensing method fusing panoramic image and laser radar
CN110108269A (en) AGV localization method based on Fusion
CN113327296B (en) Laser radar and camera online combined calibration method based on depth weighting
CN111476798B (en) Vehicle space morphology recognition method and system based on contour constraint
CN110415299B (en) Vehicle position estimation method based on set guideboard under motion constraint
CN113465573A (en) Monocular distance measuring method and device and intelligent device
CN115201883A (en) Moving target video positioning and speed measuring system and method
CN117496467A (en) Special-shaped lane line detection method based on fusion of monocular camera and 3D LIDAR
CN115272474A (en) Three-dimensional calibration plate for combined calibration of laser radar and camera and calibration method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant