CN113720299B - Ranging method based on sliding scene of three-dimensional camera or monocular camera on guide rail - Google Patents


Info

Publication number
CN113720299B
CN113720299B (application CN202111102827.XA)
Authority
CN
China
Prior art keywords
graph
far
camera
distance
lens
Prior art date
Legal status
Active
Application number
CN202111102827.XA
Other languages
Chinese (zh)
Other versions
CN113720299A (en)
Inventor
王方聪
石珞家
辛纪潼
查美怡
王鹏
Current Assignee
Lanzhou University
Original Assignee
Lanzhou University
Priority date
Filing date
Publication date
Application filed by Lanzhou University
Priority to CN202111102827.XA
Publication of CN113720299A
Application granted
Publication of CN113720299B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00 - Measuring distances in line of sight; Optical rangefinders
    • G01C11/00 - Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04 - Interpretation of pictures

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Multimedia (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a ranging method based on the scene of a three-dimensional camera, or of a monocular camera sliding on a guide rail, comprising the following steps: acquire a group of far and near images on the same optical axis, either with a monocular camera sliding on an optical guide rail or with a three-dimensional camera; scale the near image to obtain a series of progressively scaled images; binarize the far image and the scaled images and extract edges to obtain binarized contour maps of the far and near images; perform rectangular convolution of the series of near-image contour maps with the far-image contour map in sequence; compare all convolution values to find the maximum convolution value, read the positions of the two matrices at which the maximum occurs, and record the scaled near-image size at which the two matrices coincide; retain the part of the far image that overlaps the scaled image corresponding to the maximum convolution value, forming, with the original near image, a pair of images carrying the same amount of information. The invention provides a more accurate and inexpensive camera ranging scheme.

Description

Ranging method based on sliding scene of three-dimensional camera or monocular camera on guide rail
Technical Field
The present invention relates to the field of ranging.
Background
Ranging algorithms are widely used in industrial inspection, medicine, transportation, autonomous driving, architectural design, aerospace, virtual reality, and other fields. In scenes such as autonomous driving and unmanned aerial vehicle flight, camera-based ranging has a huge cost advantage over ranging by radar, laser, and lidar.
Traditional camera ranging is divided into monocular ranging and binocular ranging. As the distance increases, the error grows continuously, distant objects cannot be measured, and the methods are difficult to use. Traditional monocular ranging also requires camera calibration, which is very cumbersome and almost inevitably introduces calibration errors. Current ranging methods therefore have problems and need improvement.
Disclosure of Invention
The invention aims to provide a ranging method based on the scene of a three-dimensional camera or of a monocular camera sliding on a guide rail, so as to obtain a more accurate and inexpensive camera ranging scheme.
The technical scheme achieving this aim is as follows:
A ranging method based on the scene of a three-dimensional camera or a monocular camera sliding on a guide rail, comprising:
Step S1: acquire a group of far and near images on the same optical axis, either with a monocular camera sliding on an optical guide rail or with a three-dimensional camera;
Step S2: downsample through a linear interpolation algorithm, scaling the near image to obtain a series of progressively scaled images;
Step S3: binarize the far image and the scaled images, and extract edges to obtain binarized contour maps of the far and near images;
Step S4: using the series of near-image contour maps as operators that are moved continuously, convolving once per move, perform rectangular convolution of the near-image contour maps with the far-image contour map in sequence;
Step S5: compare all convolution values to obtain the maximum convolution value, read the positions of the two matrices at which the maximum occurs, and record the scaled near-image size at which the two matrices coincide;
Step S6: retain the part of the far image that overlaps the scaled image corresponding to the maximum convolution value, forming, with the original near image, a pair of images carrying equal information;
Step S7: perform coordinate conversion: convert pixel coordinates in the rectangular system with origin at the upper-left corner of the image into polar coordinates with the pole at the image center, and take this coordinate system as the standard;
Step S8: apply the SIFT (scale-invariant feature transform) corner method or the SURF (speeded-up robust features) corner method to the far and near images respectively, and output the corner coordinates in polar coordinates in order;
Step S9: match corner points if the ordered corners in the far and near images lie within a certain threshold range of each other according to their polar-coordinate values;
Step S10: classify the corner points according to the closed edges of the contour map and the objects to which they belong;
Step S11: connect corner points belonging to the same object in the same image and obtain the matching relation of line segments from the position information of the matched corners; then select the length of the longest pair of matched segments, or take the average of the matched segment lengths in the far and near images, or take, for the far and near images separately, the average polar radius of the corners belonging to the same object;
Step S12: substitute the obtained length or average value into the optical relation corresponding to the scene, and solve it to obtain the object distance;
Step S13: classify the corner points or judge their object membership according to the object contour, and according to this membership take the object distance as the distance from the object to the lens.
Preferably, in step S12, the scene refers to: a monocular camera sliding on an optical guide rail, or a three-dimensional camera.
Preferably, the three-dimensional camera refers to: a small portable common-virtual-axis three-dimensional camera that uses the monocular ranging principle.
Preferably, in step S4, after one traversal, pixel positions where the two contour maps overlap multiply to 1, while non-overlapping positions may multiply to 0 or to 1.
Preferably, in step S6, alternatively the scaled near image at the coincidence and the unscaled near image form a pair of images carrying the same information.
Preferably, in step S9, alternatively a threshold is set using the ratio between the closest distance and the next-closest distance; corner matching is performed when this ratio is below the threshold, and unnecessary points are removed at the same time.
Preferably, in the scene where the monocular camera slides on the optical guide rail:
Assume the object distance at the first imaging is u and the object distance at the second imaging is u + d; the length value obtained at the first imaging of the object is h1, and the length value obtained at the second imaging is h2. Since the parameters of the monocular camera remain unchanged between the two imagings, the optical imaging relationship gives:

$$\frac{h_1}{h_2} = \frac{u + d}{u}$$

from which the object distance u is calculated.
Preferably, in the scene of the small common-virtual-axis three-dimensional camera using the monocular ranging principle, the image formed through the first lens is taken as the near image and the image formed through the second lens as the far image.
Assume L1 is the distance between the center of the 50% mirror and the upper lens, and L2 is the distance between the center of the total-reflection mirror and the lower lens; the length value in the first lens is d1 and the length value in the second lens is d2; h is the distance between the optical axes of the first and second lenses; L'1 is the distance between the object and the first lens. Since the first and second lenses have identical focal length and viewing angle θ, the optical imaging relationship gives:

$$\frac{d_1}{d_2} = \frac{L'_1 + h + L_2 - L_1}{L'_1}$$

from which the object distance L'1 is calculated.
The beneficial effects of the invention are as follows: the invention avoids applying the SIFT corner method directly, so the information difference between the front and rear views does not introduce large errors into corner selection. The invention eliminates the error caused by the differing information content of the far and near images, further reduces the error in corner matching and removal, and finally reduces the algorithm error by choosing as the parameter either the average length or the length of the longest segment. This real-time ranging algorithm with low cost, small error, and wide application range has cost advantages, solves the problem that neither monocular nor binocular ranging can measure distant objects well, and can serve well both the small portable common-virtual-axis three-dimensional camera and the scene of a monocular camera sliding on a guide rail. The method measures distance effectively and prepares material for three-dimensional reconstruction in the next step.
Drawings
FIG. 1 is a flow chart of the ranging method of the present invention based on the scene of a three-dimensional camera or a monocular camera sliding on a guide rail;
FIG. 2 is a structural diagram of one embodiment of the small common-virtual-axis portable three-dimensional camera of the present invention.
Detailed Description
The invention will be further described with reference to the accompanying drawings.
Referring to FIG. 1 and FIG. 2, the ranging method of the present invention based on the scene of a three-dimensional camera or a monocular camera sliding on a guide rail includes the following steps:
Step S1: acquire a group of far and near images on the same optical axis, either with a monocular camera sliding on an optical guide rail or with a three-dimensional camera. The camera parameters are kept consistent while the far and near images are captured.
Step S2: downsample through a linear interpolation algorithm, scaling the near image to obtain a series of progressively scaled images whose scale is reduced continually. The interpolation algorithms include bilinear, nearest-neighbor, and cubic interpolation; one of them is used for the image scaling. The scale is adjusted by a constant difference, so that the image size obtained each time shrinks arithmetically, or the scale is reduced following the structure of a Gaussian pyramid.
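For illustration only, a minimal Python sketch of how such a scale series might be built, assuming OpenCV (cv2) is available; the function name build_scaled_series, the arithmetic step of 0.02, and the lower bound of 0.5 are illustrative assumptions, not values taken from the patent:

```python
import cv2

def build_scaled_series(near_img, scale_step=0.02, min_scale=0.5):
    """Downsample the near image into a series of progressively smaller
    copies, reducing the scale factor by a constant difference each time."""
    series = []
    scale = 1.0 - scale_step
    while scale >= min_scale:
        h, w = near_img.shape[:2]
        resized = cv2.resize(near_img, (int(w * scale), int(h * scale)),
                             interpolation=cv2.INTER_LINEAR)  # bilinear
        series.append((scale, resized))
        scale -= scale_step
    return series
```

Swapping in cv2.INTER_NEAREST or cv2.INTER_CUBIC covers the alternative interpolation algorithms the description names.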
Step S3: binarize the far image and the scaled images, and extract edges to obtain binarized contour maps of the far and near images. Edge-extraction methods include the Sobel, Canny, and Laplacian methods; one of them is used for the edge extraction.
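A corresponding sketch of binarization followed by edge extraction, again assuming OpenCV; Otsu thresholding and the Canny thresholds 50/150 are assumed defaults, not values specified by the patent:

```python
import cv2
import numpy as np

def binarized_contours(img, canny_lo=50, canny_hi=150):
    """Binarize an image and extract its edges, yielding a 0/1 contour map."""
    gray = img if img.ndim == 2 else cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    _, bw = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    edges = cv2.Canny(bw, canny_lo, canny_hi)
    return (edges > 0).astype(np.uint8)  # 1 on contour pixels, 0 elsewhere
```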
Step S4: using the series of near-image contour maps as operators that are moved continuously, convolving once per move, perform rectangular convolution of the near-image contour maps with the far-image contour map in sequence. Each move of the matrix outputs one value; what matters is comparing the magnitudes of the output values across positions, and each value corresponds to a relative position of the far and near images. After one traversal, pixel positions where the contours overlap multiply to 1, while non-overlapping positions may multiply to 0 or to 1. Within one convolution, the products over all image pixels are accumulated to give one convolution result.
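One way to realize this rectangular convolution is plain cross-correlation of the 0/1 contour maps, which sums the products of overlapping pixels exactly as described. The sketch below assumes OpenCV's matchTemplate with TM_CCORR; the function name best_match is illustrative:

```python
import cv2
import numpy as np

def best_match(far_contour, near_contour_series):
    """Slide each scaled near-image contour map over the far-image contour
    map; at every offset the response is the count of coinciding edge
    pixels.  Returns (scale, top-left offset, value) of the global maximum."""
    best = (None, None, -1.0)
    far = far_contour.astype(np.float32)
    for scale, near in near_contour_series:
        if near.shape[0] > far.shape[0] or near.shape[1] > far.shape[1]:
            continue  # the operator must fit inside the far map
        # TM_CCORR sums the products of overlapping pixels -- exactly the
        # rectangular convolution described in step S4.
        resp = cv2.matchTemplate(far, near.astype(np.float32), cv2.TM_CCORR)
        _, max_val, _, max_loc = cv2.minMaxLoc(resp)
        if max_val > best[2]:
            best = (scale, max_loc, max_val)
    return best
```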
Step S5: compare all convolution values to obtain the maximum convolution value, read the positions of the two matrices at which the maximum occurs, and record the scaled near-image size at which the two matrices coincide.
Step S6: retain the part of the far image that overlaps the scaled image corresponding to the maximum convolution value, forming, with the original near image, a pair of images carrying equal information; alternatively, the scaled near image at the coincidence and the unscaled near image form the pair of images carrying equal information.
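A sketch of the cropping in steps S5-S6, under the assumption that a helper like best_match above has already returned the winning scaled near image and its top-left offset:

```python
def crop_overlap(far_img, near_scaled, offset):
    """Keep only the part of the far image that coincides with the
    best-matching scaled near image, so the two views carry the same
    amount of information (steps S5-S6)."""
    x, y = offset               # top-left corner reported by best_match
    h, w = near_scaled.shape[:2]
    return far_img[y:y + h, x:x + w]
```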
Step S7: perform coordinate conversion: convert pixel coordinates in the rectangular system with origin at the upper-left corner of the image into polar coordinates with the pole at the image center, and take this coordinate system as the standard.
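The conversion amounts to shifting the origin to the image center and computing radius and angle; a minimal NumPy sketch (the function name to_polar is illustrative):

```python
import numpy as np

def to_polar(points, img_shape):
    """Convert pixel coordinates (origin at the top-left corner) into polar
    coordinates whose pole is the image center (step S7)."""
    cy, cx = img_shape[0] / 2.0, img_shape[1] / 2.0
    pts = np.asarray(points, dtype=np.float64)
    dx, dy = pts[:, 0] - cx, pts[:, 1] - cy
    r = np.hypot(dx, dy)            # polar radius
    theta = np.arctan2(dy, dx)      # polar angle
    return np.stack([r, theta], axis=1)
```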
Step S8: apply the SIFT corner method or the SURF corner method to the far and near images respectively, and output the corner coordinates in polar coordinates in order.
Step S9: match corner points if the ordered corners in the far and near images lie within a certain threshold range of each other according to their polar-coordinate values; alternatively, set a threshold using the ratio between the closest distance and the next-closest distance, perform corner matching when this ratio is below the threshold, and remove unnecessary points.
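Steps S8 and S9 map naturally onto OpenCV's SIFT detector plus the nearest/next-nearest distance ratio test; in the sketch below the 0.75 ratio is an assumed, commonly used value rather than one given in the patent:

```python
import cv2

def matched_corners(far_img, near_img, ratio=0.75):
    """Detect SIFT keypoints in both images and keep only the matches that
    pass the nearest/next-nearest distance ratio test (steps S8-S9)."""
    g1 = cv2.cvtColor(far_img, cv2.COLOR_BGR2GRAY)
    g2 = cv2.cvtColor(near_img, cv2.COLOR_BGR2GRAY)
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(g1, None)
    kp2, des2 = sift.detectAndCompute(g2, None)
    pairs = cv2.BFMatcher().knnMatch(des1, des2, k=2)
    good = [p[0] for p in pairs
            if len(p) == 2 and p[0].distance < ratio * p[1].distance]
    return ([kp1[m.queryIdx].pt for m in good],   # far-image corners
            [kp2[m.trainIdx].pt for m in good])   # matched near-image corners
```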
Step S10: classify the corner points according to the closed edges of the contour map and the objects to which they belong.
Step S11: connect corner points belonging to the same object in the same image and obtain the matching relation of line segments from the position information of the matched corners; then select the length of the longest pair of matched segments, or take the average of the matched segment lengths in the far and near images, or take, for the far and near images separately, the average polar radius of the corners belonging to the same object.
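Of the three options in step S11, the first (the longest pair of matched segments) might look like the sketch below; the quadratic scan over corner pairs is an illustrative choice, not a requirement of the patent:

```python
import numpy as np

def longest_matched_segment(near_pts, far_pts):
    """Connect matched corners of the same object pairwise and return the
    (near, far) lengths of the longest matched segment pair (step S11)."""
    best = (0.0, 0.0)   # (near length h1, far length h2)
    for i in range(len(near_pts)):
        for j in range(i + 1, len(near_pts)):
            ln = np.hypot(near_pts[i][0] - near_pts[j][0],
                          near_pts[i][1] - near_pts[j][1])
            lf = np.hypot(far_pts[i][0] - far_pts[j][0],
                          far_pts[i][1] - far_pts[j][1])
            if ln > best[0]:
                best = (ln, lf)
    return best
```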
Step S12: substitute the obtained length or average value into the optical relation corresponding to the scene, and solve it to obtain the object distance. The scene refers to: a monocular camera sliding on an optical guide rail, or a three-dimensional camera. The three-dimensional camera refers to: a small portable common-virtual-axis three-dimensional camera that uses the monocular ranging principle.
In the scene where the monocular camera slides on the optical guide rail:
Assume the object distance at the first imaging is u and the object distance at the second imaging is u + d; the length value obtained at the first imaging of the object is h1, and the length value obtained at the second imaging is h2. Since the parameters of the monocular camera remain unchanged between the two imagings, the optical imaging relationship gives:

$$\frac{h_1}{h_2} = \frac{u + d}{u}$$

from which the object distance u is calculated. Here d can be read directly on the optical guide rail, and h1 and h2 have already been read out from the images; substituting these parameters into the above formula yields the object distance u.
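Solving the relation above for u gives u = h2·d/(h1 − h2); a one-line sketch with a worked example:

```python
def object_distance_rail(h1, h2, d):
    """Solve h1/h2 = (u + d)/u for the object distance u at the first
    (nearer) imaging position; d is read directly from the guide rail."""
    return h2 * d / (h1 - h2)

# Example: if the object spans h1 = 120 px in the near image, h2 = 100 px
# in the far image, and the camera slid d = 0.1 m along the rail, then
# u = 100 * 0.1 / (120 - 100) = 0.5 m.
```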
In the scene of the small common-virtual-axis three-dimensional camera using the monocular ranging principle:
Assume L1 is the distance between the center of the 50% mirror and the upper lens, and L2 is the distance between the center of the total-reflection mirror and the lower lens; the length value in the first lens is d1 and the length value in the second lens is d2; h is the distance between the optical axes of the first and second lenses; L'1 is the distance between the object and the first lens. Since the first and second lenses have identical focal length and viewing angle θ, the optical imaging relationship gives:

$$\frac{d_1}{d_2} = \frac{L'_1 + h + L_2 - L_1}{L'_1}$$

from which the object distance L'1 is calculated. L1, L2, and h are known quantities, and d1 and d2 have already been read out from the images; substituting these parameters into the above formula yields the object distance L'1.
Step S13: classify the corner points or judge their object membership according to the object contour, and according to this membership take the object distance as the distance from the object to the lens; that is, the object distance obtained for a set of corner points represents the distance from the object they outline to the lens.
In FIG. 2, the reference numerals denote: first lens 1; second lens 2; beamsplitter (50% mirror) 3; total-reflection mirror 4; target object 5; distance h between the first and second lens optical axes 6; first lens optical axis 7; second lens optical axis 8.
The above embodiments are provided to illustrate the present invention, not to limit it. Those skilled in the relevant art may make various changes and modifications without departing from the spirit and scope of the invention, so all equivalent technical solutions fall within the scope defined by the claims.

Claims (5)

1. A ranging method based on the scene of a three-dimensional camera or a monocular camera sliding on a guide rail, characterized by comprising the following steps:
Step S1: acquire a group of far and near images on the same optical axis, either with a monocular camera sliding on an optical guide rail or with a three-dimensional camera;
Step S2: downsample through a linear interpolation algorithm, scaling the near image to obtain a series of progressively scaled images;
Step S3: binarize the far image and the scaled images, and extract edges to obtain binarized contour maps of the far and near images;
Step S4: using the series of near-image contour maps as operators that are moved continuously, convolving once per move, perform rectangular convolution of the near-image contour maps with the far-image contour map in sequence;
Step S5: compare all convolution values to obtain the maximum convolution value, read the positions of the two matrices at which the maximum occurs, and record the scaled near-image size at which the two matrices coincide;
Step S6: retain the part of the far image that overlaps the scaled image corresponding to the maximum convolution value, forming, with the original near image, a pair of images carrying equal information;
Step S7: perform coordinate conversion: convert pixel coordinates in the rectangular system with origin at the upper-left corner of the image into polar coordinates with the pole at the image center, and take this coordinate system as the standard;
Step S8: apply the SIFT corner method or the SURF corner method to the far and near images respectively, and output the corner coordinates in polar coordinates in order;
Step S9: match corner points if the ordered corners in the far and near images lie within a certain threshold range of each other according to their polar-coordinate values;
Step S10: classify the corner points according to the closed edges of the contour map and the objects to which they belong;
Step S11: connect corner points belonging to the same object in the same image and obtain the matching relation of line segments from the position information of the matched corners; then select the length of the longest pair of matched segments, or take the average of the matched segment lengths in the far and near images, or take, for the far and near images separately, the average polar radius of the corners belonging to the same object;
Step S12: substitute the obtained length or average value into the optical relation corresponding to the scene, and solve it to obtain the object distance;
Step S13: classify the corner points or judge their object membership according to the object contour, and according to this membership take the object distance as the distance from the object to the lens;
in step S12, the scene refers to: a scene in which a monocular camera slides on an optical guide rail, or a three-dimensional camera scene;
in the scene where the monocular camera slides on the optical guide rail:
assume the object distance at the first imaging is u and the object distance at the second imaging is u + d; the length value obtained at the first imaging of the object is h1, and the length value obtained at the second imaging is h2; since the parameters of the monocular camera remain unchanged between the two imagings, the optical imaging relationship gives:

$$\frac{h_1}{h_2} = \frac{u + d}{u}$$

from which the object distance u is calculated;
in the scene of the small common-virtual-axis three-dimensional camera using the monocular ranging principle:
assume L1 is the distance between the center of the 50% mirror and the upper lens, and L2 is the distance between the center of the total-reflection mirror and the lower lens; the length value in the first lens is d1 and the length value in the second lens is d2; h is the distance between the optical axes of the first and second lenses; L'1 is the distance between the object and the first lens; since the first and second lenses have identical focal length and viewing angle θ, the optical imaging relationship gives:

$$\frac{d_1}{d_2} = \frac{L'_1 + h + L_2 - L_1}{L'_1}$$

from which the object distance L'1 is calculated.
2. The ranging method based on the scene of a three-dimensional camera or a monocular camera sliding on a guide rail according to claim 1, characterized in that the three-dimensional camera refers to: a small portable common-virtual-axis three-dimensional camera that uses the monocular ranging principle.
3. The ranging method based on the scene of a three-dimensional camera or a monocular camera sliding on a guide rail according to claim 1, characterized in that, in step S4, after one traversal, pixel positions where the two contour maps overlap multiply to 1, while non-overlapping positions may multiply to 0 or to 1.
4. The ranging method based on the scene of a three-dimensional camera or a monocular camera sliding on a guide rail according to claim 1, characterized in that, in step S6, alternatively the scaled near image at the coincidence and the unscaled near image form a pair of images carrying the same information.
5. The ranging method based on the scene of a three-dimensional camera or a monocular camera sliding on a guide rail according to claim 1, characterized in that, in step S9, alternatively a threshold is set using the ratio between the closest distance and the next-closest distance; corner matching is performed when this ratio is below the threshold, and unnecessary points are removed at the same time.
CN202111102827.XA 2021-09-18 2021-09-18 Ranging method based on sliding scene of three-dimensional camera or monocular camera on guide rail Active CN113720299B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111102827.XA CN113720299B (en) 2021-09-18 2021-09-18 Ranging method based on sliding scene of three-dimensional camera or monocular camera on guide rail

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111102827.XA CN113720299B (en) 2021-09-18 2021-09-18 Ranging method based on sliding scene of three-dimensional camera or monocular camera on guide rail

Publications (2)

Publication Number Publication Date
CN113720299A (en) 2021-11-30
CN113720299B (en) 2023-07-14

Family

ID=78684462

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111102827.XA Active CN113720299B (en) 2021-09-18 2021-09-18 Ranging method based on sliding scene of three-dimensional camera or monocular camera on guide rail

Country Status (1)

Country Link
CN (1) CN113720299B (en)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105627926A (en) * 2016-01-22 2016-06-01 尹兴 Four-camera group planar array feature point three-dimensional measurement system and measurement method
WO2017124654A1 (en) * 2016-01-22 2017-07-27 尹兴 Three-dimensional measurement system and measurement method for feature point based on plane of four-camera set array
CN109146980A (en) * 2018-08-12 2019-01-04 浙江农林大学 The depth extraction and passive ranging method of optimization based on monocular vision
CN109489620A (en) * 2019-01-12 2019-03-19 内蒙古农业大学 A kind of monocular vision distance measuring method
CN111982072A (en) * 2020-07-29 2020-11-24 西北工业大学 Target ranging method based on monocular vision
CN112330740A (en) * 2020-10-28 2021-02-05 华北电力大学(保定) Pseudo-binocular dynamic distance measurement method based on monocular video

Also Published As

Publication number Publication date
CN113720299A (en) 2021-11-30


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant