CN116051616A - Depth measurement method and device, storage medium and electronic equipment


Info

Publication number
CN116051616A
Authority
CN
China
Prior art keywords
point
depth
ranging sensor
plane
determining
Prior art date
Legal status
Granted
Application number
CN202111261392.3A
Other languages
Chinese (zh)
Other versions
CN116051616B (en)
Inventor
胡佳欣
郎小明
臧波
Current Assignee
Beijing Sankuai Online Technology Co Ltd
Original Assignee
Beijing Sankuai Online Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Sankuai Online Technology Co Ltd
Priority to CN202111261392.3A
Publication of CN116051616A
Application granted
Publication of CN116051616B
Status: Active


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/521 Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10028 Range image; Depth image; 3D point clouds


Abstract

This specification provides a depth measurement method. After the position of a feature point in a depth image and the position of the point detected by a single-point ranging sensor are obtained, the measurement result of the single-point ranging sensor is not assigned directly as the depth of the feature point. Instead, several planes are constructed from the feature points within a preset range around the point detected by the single-point ranging sensor; the plane containing the detected point is taken as a reference to find the feature points to be determined on that plane; and, using the relation between a point and a plane, the actual depth of each feature point to be determined is found by taking as the optimization target that the feature point lies on the plane containing the detected point. The measurement method provided by this specification accounts for errors that may arise during measurement, effectively reduces their influence on the result, and improves the accuracy of depth assignment for feature points in the depth image.

Description

Depth measurement method and device, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of unmanned technologies, and in particular, to a depth measurement method, a depth measurement device, a storage medium, and an electronic device.
Background
At present, unmanned equipment shows excellent application prospects in many fields, among which the development of autonomous driving is the most prominent. Unmanned equipment must avoid obstacles during automatic driving, so a single-point ranging sensor (for example ultrasonic, laser, or infrared ranging) is usually installed to sense surrounding obstacles; single-point ranging sensors are relatively cheap while offering good ranging range and accuracy, and therefore suit a wide range of application scenarios.
Nowadays the combined use of multiple sensors is a development trend for unmanned equipment, and assigning depths to feature points in a depth image acquired by a camera using the measurement results of a single-point ranging sensor is one of the more common techniques: the data detected by the single-point ranging sensor are received, depths are assigned to the feature points in the depth image, and subsequent processing follows. However, assigning depths to feature points in a depth image this way often produces large errors.
In the prior art, a feature point in the depth image is usually assigned either the depth measured by the single-point ranging sensor directly, or the average of the measured depths of the points surrounding it in the depth image. During actual measurement, however, the point detected by the single-point ranging sensor is quite likely not the target point itself but a point near it, so the measurement result is not entirely accurate and is prone to error. Assigning depths directly from the measurement result ignores the influence of errors produced during measurement, so the depth values obtained for feature points on the depth image are often inaccurate.
This specification provides a depth measurement method that can effectively reduce such errors and address the inaccuracy that arises when the measurement result of a single-point ranging sensor is used to assign depths to feature points in a camera's depth image.
Disclosure of Invention
The present disclosure provides a depth measurement method, a depth measurement device, a storage medium, and an electronic apparatus, so as to partially solve the above-mentioned problems in the prior art.
The technical scheme adopted in the specification is as follows:
the specification provides a depth measurement method, comprising:
acquiring a depth image and the position of a point detected by a single-point ranging sensor in the depth image;
determining, according to the position in the depth image of the point detected by the single-point ranging sensor, feature points within a preset range of that position in the depth image;
determining the estimated position of each feature point according to the estimated depth of each feature point, and determining a standard plane according to the estimated position of each feature point;
selecting feature points to be determined from the feature points according to the standard plane;
and, for each feature point to be determined, adjusting the estimated depth of that feature point with its position lying on the standard plane as the optimization target, so as to determine its actual depth.
Optionally, determining the estimated position of each feature point according to the estimated depth of each feature point, and determining the standard plane according to the estimated position of each feature point specifically includes:
determining a plurality of triangles by taking the feature points as vertices, wherein each feature point is a vertex of at least one triangle;
determining, according to the position in the depth image of the point detected by the single-point ranging sensor, the triangles containing that point, and taking as the central plane the plane corresponding to the triangle, among those containing the point, whose vertices have the smallest sum of distances to the detected point;
and determining a standard plane according to the central plane and the triangles.
Optionally, determining a standard plane according to the central plane and the triangles specifically includes:
taking, for each triangle, the plane corresponding to that triangle as a plane to be combined;
if the included angle between the plane to be combined and the central plane is not larger than a first specified threshold, and the distance from the position of the point detected by the single-point ranging sensor to the plane to be combined is not larger than a second specified threshold, re-determining the central plane according to the central plane and the plane to be combined, until a specified condition is met; the finally determined central plane is taken as the standard plane.
Optionally, for each feature point to be determined, adjusting the estimated depth of that feature point with its position lying on the standard plane as the optimization target, so as to determine its actual depth, specifically includes:
for each feature point to be determined, determining a displacement vector between that feature point and the point detected by the single-point ranging sensor according to the position of the feature point and the position of the detected point;
determining a related term according to the displacement vector and the normal vector of the standard plane;
and adjusting the estimated depth of the feature point to be determined according to the related term, so as to determine its actual depth.
Optionally, adjusting the estimated depth of the feature point to be determined according to the related term, so as to determine its actual depth, specifically includes:
determining the error of the depth of the point detected by the single-point ranging sensor according to that depth;
determining the error of the related term according to the error of the depth of the point detected by the single-point ranging sensor;
and adjusting the estimated depth of the feature point to be determined according to the error of the related term, so as to determine its actual depth.
Optionally, determining the error of the related term according to the error of the depth of the point detected by the single-point ranging sensor specifically includes:
determining the error of the related term according to the error of the depth of the point detected by the single-point ranging sensor and the normal vector of the standard plane.
Optionally, adjusting the estimated depth of the feature point to be determined according to the error of the related term, so as to determine its actual depth, specifically includes:
taking the error of the related term as the error of the observed quantity, and determining the actual depth of the feature point to be determined using extended Kalman filtering.
A depth measurement apparatus provided in the present specification comprises:
an acquisition module, configured to acquire a depth image and the position of the point detected by a single-point ranging sensor in the depth image;
a first determining module, configured to determine, according to the position in the depth image of the point detected by the single-point ranging sensor, feature points within a preset range of that position in the depth image;
a second determining module, configured to determine the estimated position of each feature point according to its estimated depth, and to determine a standard plane according to the estimated positions;
a selection module, configured to select feature points to be determined from the feature points according to the standard plane;
and a depth determining module, configured to adjust, for each feature point to be determined, the estimated depth of that feature point with its position lying on the standard plane as the optimization target, so as to determine its actual depth.
The present specification provides a computer readable storage medium storing a computer program which when executed by a processor implements the depth measurement method described above.
The present specification provides an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the depth measurement method described above when executing the program.
At least one of the technical schemes adopted in this specification can achieve the following beneficial effects:
the description provides a depth measurement method, after the position of a feature point in a depth image and the position of a point detected by a single-point ranging sensor are obtained, a measurement result of the single-point ranging sensor is not directly used as the depth of the feature point to be assigned, a plurality of planes are established by utilizing the feature point in a preset range of the point detected by the single-point ranging sensor, a to-be-determined feature point on the plane is found by taking the plane where the position of the point detected by the single-point ranging sensor is located as a reference, and the actual depth of the to-be-determined feature point is determined by taking the position of the to-be-determined feature point on the plane where the position of the point detected by the single-point ranging sensor is located as an optimization target according to the relation between the point and the plane. According to the measuring method provided by the specification, errors possibly occurring in the measuring process are considered, the influence of errors caused by the measuring process on the result is effectively reduced, and the accuracy of the depth assignment of the feature points in the depth image is improved.
Drawings
The accompanying drawings, which are included to provide a further understanding of the specification, illustrate its exemplary embodiments and, together with their description, serve to explain the specification; they are not intended to limit it unduly. In the drawings:
FIG. 1 is a schematic flow chart of a depth measurement method in the present specification;
FIG. 2 is a schematic diagram of a triangulation method for feature points in the present specification;
FIG. 3 is a schematic view of a depth measurement device provided herein;
FIG. 4 is a schematic view of the electronic device corresponding to FIG. 1 provided in the present specification.
Detailed Description
To make the objects, technical solutions, and advantages of the present specification clearer, the technical solutions of the specification are described clearly and completely below with reference to its specific embodiments and the corresponding drawings. The described embodiments are evidently only some, not all, of the embodiments of the present specification. All other embodiments obtained by one of ordinary skill in the art based on the embodiments herein without creative effort fall within the scope of protection of this application.
Since the concept of simultaneous localization and mapping (Simultaneous Localization And Mapping, SLAM) was proposed, it has received extensive research and attention, and visual-inertial odometry (Visual-Inertial Odometry, VIO) is one of the most common techniques in SLAM. VIO uses the data obtained by a camera and by an inertial measurement unit (Inertial Measurement Unit, IMU) simultaneously, combining the advantages of both to obtain good localization and mapping results.
When VIO localizes the camera, it relies on a visual positioning technique based on the matching of feature points. A depth image acquired by the camera contains a large number of highly recognizable feature points, such as easily identified points where the gray value changes sharply or the curvature is large. While the vehicle is driving, two adjacent image frames acquired by the camera share some of the same feature points, and the depth change of a shared feature point between the two adjacent frames is the main reference quantity for localization. Whether the depths of the feature points on the depth image obtained by the camera are accurate therefore directly affects whether the estimate of the camera's position is accurate.
At present, the most widely used approach assigns depths to the feature points on the camera's depth image using the measurement result of a single-point ranging sensor. In the prior art, after the measurement result corresponding to a feature point in the depth image is obtained, it is generally taken directly as the feature point's depth value. However, during the actual measurement of any target point there is no guarantee that the point detected by the single-point ranging sensor is exactly the target point; in most cases it is some point near the target point, very close to but not on it, so an error necessarily arises during measurement.
Clearly, the prior-art practice of taking the measurement result corresponding to a feature point directly as that point's depth ignores the influence of the errors produced during measurement, so the feature-point depths it yields have low accuracy. The depth measurement method provided by this specification accounts for the errors that may arise during measurement and can effectively improve the accuracy of depth assignment for feature points in the depth image.
The following describes in detail the technical solutions provided by the embodiments of the present specification with reference to the accompanying drawings.
Fig. 1 is a schematic flow chart of a depth measurement method in the present specification, which specifically includes the following steps:
s100: a depth image and a position of a point detected by a single point ranging sensor in the depth image are acquired.
When acquiring the information of the single-point ranging sensor and the feature points, a global or local map of a specified area may be built using VIO-based SLAM; the map includes depth images of the specified area acquired by the camera in the VIO. In general, a depth image acquired in the specified area contains multiple feature points, and an initial depth value can be estimated for each of them in several ways, such as triangulation or phase resolution. Because these methods involve no actual measurement, they often only estimate an approximate location and are not accurate enough, so the depth of each feature point needs to be re-determined using the measurement result of the single-point ranging sensor to obtain a more accurate result. The position of the point detected by the single-point ranging sensor in the depth image is determined for processing in the subsequent steps.
It should be noted that the ranging device in this specification may be any ranging sensor, such as radar or an ultrasonic ranging sensor; since single-point ranging sensors have the widest application, this specification takes the single-point ranging sensor as its example.
The depth measurement method provided in this specification may be performed by any unmanned device, where an unmanned device refers to a device capable of automatic driving, such as an unmanned vehicle, a robot, or an automatic delivery device. On this basis, an unmanned device applying the depth measurement method provided in this specification can perform delivery tasks in the delivery field, for example in business scenarios that use unmanned devices for express delivery, logistics, or takeaway delivery.
S102: and determining characteristic points in a position preset range of the point detected by the single-point ranging sensor in the depth image according to the position of the point detected by the single-point ranging sensor in the depth image.
The number of feature points in the depth image is generally large; the feature points within a preset range of the position, in the depth image, of the point detected by the single-point ranging sensor are taken as the feature points that may participate in the assignment. Note that the position referred to in this step is the pixel coordinates of the detected point in the depth image, and the preset range likewise refers to a range within the depth image. Specifically, a circle is drawn in the depth image centered on the position of the detected point, with a specified length as its radius; this circle is the preset range, and the specified length can be set according to specific requirements.
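As a minimal sketch of this selection step (Python with NumPy assumed; the function name and the 40-pixel radius are illustrative assumptions, not from the patent):

```python
import numpy as np

def select_nearby_features(feature_uv: np.ndarray, probe_uv: np.ndarray,
                           radius_px: float = 40.0) -> np.ndarray:
    """Return indices of feature points whose pixel coordinates fall inside
    the circle of the given radius centered on the point detected by the
    single-point ranging sensor (all coordinates in the depth image)."""
    # feature_uv: (N, 2) pixel coordinates; probe_uv: (2,) pixel coordinates
    dist = np.linalg.norm(feature_uv - probe_uv, axis=1)
    return np.where(dist <= radius_px)[0]
```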
S104: and determining the estimated position of each feature point according to the estimated depth of each feature point, and determining a standard plane according to the estimated position of each feature point.
In the depth image, the information carried by each feature point generally includes its position in the depth image and its estimated depth. The position in the depth image is usually expressed as two-dimensional pixel coordinates, and the estimated depth is the initial depth value, obtainable by various methods, mentioned in step S100. For each feature point on the depth image, its estimated position in the world coordinate system, a three-dimensional coordinate, can be obtained from its estimated depth using the transformation between the camera's image coordinate system and the world coordinate system. Note that the origin of the world coordinate system may be placed at any position or object as required, for example the camera, the single-point ranging sensor, or the unmanned device itself; this specification does not limit it.
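A sketch of this back-projection, assuming a pinhole camera model with intrinsics fx, fy, cx, cy and a camera-to-world rotation R_wc and translation t_wc; all names are illustrative assumptions rather than the patent's notation:

```python
import numpy as np

def pixel_to_world(u: float, v: float, depth: float,
                   fx: float, fy: float, cx: float, cy: float,
                   R_wc: np.ndarray, t_wc: np.ndarray) -> np.ndarray:
    """Lift pixel (u, v) with estimated depth to a 3-D point: first into the
    camera frame via the pinhole model, then into the world frame."""
    p_cam = np.array([(u - cx) * depth / fx,
                      (v - cy) * depth / fy,
                      depth])
    return R_wc @ p_cam + t_wc  # estimated position in the world frame
```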
With many points in space, a number of different planes can be delineated. To update the depth of the feature points using the measurement result of the single-point ranging sensor, the planes containing the detected point can be identified among the divided planes, and a standard plane selected from them. Depending on the rules and conditions adopted, the same space can be divided in different ways and different standard planes determined: for example, each plane may be required to contain a specified number of feature points, or the feature points may be required to compose a specific shape.
In particular, the division may take the form of a triangulation. As shown in fig. 2, a plurality of triangles are determined by taking the feature points as vertices, each feature point being a vertex of at least one triangle. The triangles containing the point detected by the single-point ranging sensor are then determined from that point's position in the depth image, and the plane corresponding to the triangle, among them, whose vertices have the smallest sum of distances to the detected point is taken as the central plane.
Following the rule that three points determine a plane, a number of triangles are determined with the feature points as vertices, and each triangle lies on its own plane. From the position of the detected point in the world coordinate system, the triangles containing it can be identified; among these, the triangle whose three vertices have the smallest sum of distances to the detected point, i.e. the triangle most nearly centered on the point, is selected, and the plane it lies on is taken as the central plane.
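One way to realize this step is Delaunay triangulation over the feature points' pixel coordinates, sketched below with SciPy (illustrative names). In a Delaunay triangulation the containing triangle is unique except when the point falls on a shared edge, which is where the patent's smallest-distance-sum rule would break the tie:

```python
import numpy as np
from scipy.spatial import Delaunay

def center_plane_vertices(feature_uv: np.ndarray, feature_xyz: np.ndarray,
                          probe_uv: np.ndarray) -> np.ndarray:
    """Triangulate the feature points in pixel coordinates, locate the
    triangle containing the detected point, and return the estimated 3-D
    positions of its three vertices, which span the center plane."""
    tri = Delaunay(feature_uv)                     # feature_uv: (N, 2)
    simplex = int(tri.find_simplex(probe_uv[None, :])[0])
    if simplex < 0:
        raise ValueError("detected point lies outside the triangulation")
    vertex_ids = tri.simplices[simplex]            # three vertex indices
    return feature_xyz[vertex_ids]                 # (3, 3) vertex positions
```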
While dividing the planes, it may happen that the planes of several triangles are really one and the same plane. Because every plane is derived from estimated feature-point positions and is therefore not very accurate, two planes whose included angle and mutual distance are both very small can be treated approximately as the same plane and combined. Specifically, for each triangle adjacent to the triangle of the central plane, its corresponding plane is taken as a plane to be combined; if the included angle between the plane to be combined and the central plane is not larger than a first specified threshold, and the distance from the position of the point detected by the single-point ranging sensor to the plane to be combined is not larger than a second specified threshold, the central plane is re-determined from the central plane and the plane to be combined, until the specified condition is met, and the finally determined central plane is taken as the standard plane. The first and second specified thresholds can be preset as required.
Here, a triangle adjacent to the triangle of the central plane is one that shares at least one vertex and/or a common side with it; when the central plane is re-determined, the plane to be combined is treated as an extension of the central plane and merged into it; and the specified condition is that, for the current central plane, no remaining plane to be combined both forms an included angle with it of at most the first specified threshold and lies within the second specified threshold of the position of the point detected by the single-point ranging sensor.
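A minimal sketch of this merging loop under one possible representation: each plane is a least-squares fit over its supporting 3-D points, stored in Hesse normal form n·x + d = 0. The threshold values are illustrative stand-ins for the first and second specified thresholds:

```python
import numpy as np

def fit_plane(points: np.ndarray):
    """Least-squares plane through 3-D points: unit normal n and offset d
    such that n @ x + d is approximately 0 for every point x."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    n = vt[-1]                                # direction of least variance
    return n, -float(n @ centroid)

def merge_into_standard_plane(center_pts, neighbor_tris, probe_xyz,
                              max_angle=np.deg2rad(5.0), max_dist=0.05):
    """Grow the center plane by absorbing adjacent triangle planes that pass
    both threshold tests, re-fitting after each merge, until no candidate
    qualifies; the final fit is the standard plane."""
    n, d = fit_plane(center_pts)
    pending = list(neighbor_tris)             # each item: (3, 3) vertices
    changed = True
    while changed:
        changed = False
        for i, tri_pts in enumerate(pending):
            tn, td = fit_plane(tri_pts)
            angle = np.arccos(np.clip(abs(n @ tn), 0.0, 1.0))
            dist = abs(tn @ probe_xyz + td)   # detected point to candidate
            if angle <= max_angle and dist <= max_dist:
                center_pts = np.vstack([center_pts, tri_pts])
                n, d = fit_plane(center_pts)  # re-determine the center plane
                pending.pop(i)
                changed = True
                break
    return n, d
```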
S106: Selecting feature points to be determined from the feature points according to the standard plane.
In the measurement method provided by this specification, the depth of the feature points is updated according to the measurement result of the single-point ranging sensor, so the detected point and the feature points must be associated through a specific relation; that relation is the standard plane determined in step S104. All feature points on the standard plane are taken as feature points to be determined, and their depths are updated in the subsequent steps using the constraint between a point and a plane.
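Under the plane representation assumed in the sketches above (unit normal n, offset d), "on the standard plane" can be realized as a small distance test; the tolerance is an illustrative assumption:

```python
import numpy as np

def select_pending_features(feature_xyz: np.ndarray, n: np.ndarray,
                            d: float, tol: float = 1e-3) -> np.ndarray:
    """Indices of feature points whose estimated positions satisfy
    n @ x + d = 0 up to the tolerance, i.e. lie on the standard plane."""
    return np.where(np.abs(feature_xyz @ n + d) <= tol)[0]
```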
S108: and aiming at each undetermined characteristic point, taking the position of the undetermined characteristic point on the standard plane as an optimization target, and adjusting the estimated depth of the undetermined characteristic point to determine the actual depth of the undetermined characteristic point.
In updating the depth of the feature points, the depths may be recorded using a state variable $X$:

$$X = \left[\, q_{\mathrm{ref}},\ p_{\mathrm{ref}},\ q_{\mathrm{cur}},\ p_{\mathrm{cur}},\ \lambda_1,\ \lambda_2,\ \lambda_3,\ \dots \,\right]$$

where $q_{\mathrm{ref}}$ denotes the pose of the single-point ranging sensor under the reference frame, $p_{\mathrm{ref}}$ its position under the reference frame, $q_{\mathrm{cur}}$ its pose under the current frame, $p_{\mathrm{cur}}$ its position under the current frame, and $\lambda_1, \lambda_2, \lambda_3, \dots$ the parameterization of the feature points, which can be chosen in various ways as needed; for example, $\lambda_1, \lambda_2, \lambda_3, \dots$ may represent the position of each feature point under its reference frame. For any feature point, the current frame refers to the current time, and the reference frame refers to the time at which the feature point was first observed by the camera; each feature point may correspond to a different reference frame, so there may be multiple reference frames.
As stated in step S106, the feature points to be determined are taken from the standard plane, so ideally every such point lies exactly on the standard plane. In practice, however, each plane is determined from the estimated positions of the feature points, which are not necessarily accurate, and the combined planes are only approximately, not exactly, coincident, so an error necessarily exists. Adjusting, for each feature point to be determined, its estimated depth with its position lying on the standard plane as the optimization target is precisely the process of eliminating this error.
Specifically, for each feature point to be determined, a displacement vector between that feature point and the point detected by the single-point ranging sensor is determined from their positions; the inner product of the displacement vector and the normal vector of the standard plane is computed as the related term; and the estimated depth of the feature point is adjusted according to the related term to determine its actual depth. The specific formula is as follows:

$$r = \vec{n}^{T}\left(p - d\,\vec{u}\right)$$

where $r$ denotes the related term, $\vec{n}$ the normal vector of the standard plane, $p$ the estimated position of the feature point, $d$ the depth of the point detected by the single-point ranging sensor, and $\vec{u}$ the unit direction vector toward the point detected by the single-point ranging sensor; the product $d\,\vec{u}$ is thus the position of the point detected by the single-point ranging sensor.
The normal vector $\vec{n}$ of the standard plane can be determined in various ways. For example, the positions of three different points can be chosen arbitrarily on the standard plane, and the normal vector determined from them by the following formula:

$$\vec{n} = \left(p_1 - p_2\right) \times \left(p_1 - p_3\right)$$

where $\vec{n}$ denotes the normal vector of the standard plane and $p_1$, $p_2$, $p_3$ the positions of any three different points on it. The difference between the positions of two points is the vector between them, and the cross product of any two such vectors is necessarily perpendicular to both; it is therefore also perpendicular to the plane containing them, i.e., it is the normal vector of that plane.
Ideally, the displacement vector lies exactly in the standard plane, and the inner product of any vector in the standard plane with that plane's normal vector is necessarily zero. In practice, because of the errors discussed above, the displacement vector deviates and the inner product is not zero. The optimization target can therefore be converted from the position of the feature point to be determined lying on the standard plane to the inner product of the displacement vector and the normal vector of the standard plane, and the depth of each feature point to be determined is updated continuously during the optimization to obtain its actual depth.
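A compact sketch of the two formulas above (illustrative names: d and u come from the ranging sensor, p1, p2, p3 from the standard plane):

```python
import numpy as np

def plane_normal(p1: np.ndarray, p2: np.ndarray, p3: np.ndarray) -> np.ndarray:
    """Unit normal of the plane through three points of the standard plane."""
    n = np.cross(p1 - p2, p1 - p3)
    return n / np.linalg.norm(n)

def related_term(p_feat: np.ndarray, d: float, u: np.ndarray,
                 n: np.ndarray) -> float:
    """Inner product of the displacement vector (from the detected point
    d * u to the estimated feature position p_feat) with the plane normal;
    it is zero exactly when the feature point lies on the standard plane."""
    return float(n @ (p_feat - d * u))
```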
The depth measurement method provided by this specification does not assign the measurement result of the single-point ranging sensor directly as the depth of the feature points in the depth image; instead, an observation model is built from the sensor's detection result according to the steps above. The space containing the feature points is divided into several planes by triangulation, the plane containing the detected point is found, and the depth of each feature point to be determined is adjusted continuously using the constraint between a point and a plane, yielding its actual depth. This reduces the influence of errors produced during the single-point ranging sensor's measurement and improves, to a certain extent, the accuracy of feature-point depth assignment in the depth image.
In addition, to further improve the accuracy of depth assignment for feature points in the depth image, the method provided by this specification can also adjust the estimated depths of the feature points to be determined using extended Kalman filtering, so as to determine their actual depths: determine the error of the depth of the point detected by the single-point ranging sensor from that depth; determine the error of the related term from the error of the detected depth and the normal vector of the standard plane; and adjust the estimated depth of each feature point to be determined according to the error of the related term, so as to determine its actual depth.
Specifically, the error of the single-point ranging sensor's detection result may be set as required; for example, 1% of the depth of the detected point may be taken as that error. The formula given in step S108 shows that the related term r depends on the depth d of the point detected by the single-point ranging sensor, so the error of the related term can likewise be expressed through the error of the detection result, as follows:
$$R_{1\times 1} = \vec{n}^{T}\, S_{3\times 3}\, \vec{n}$$

where $R_{1\times 1}$ denotes the error of the related term, $S_{3\times 3}$ the error of the single-point ranging sensor's detection result, $\vec{n}$ the normal vector of the standard plane, and $\vec{n}^{T}$ its transpose. Because the detection result is acquired in three-dimensional space, its error $S_{3\times 3}$ is a $3\times 3$ matrix containing the error of each dimension.
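In code this propagation is a single quadratic form. The sketch below assumes an isotropic covariance S built from the 1% relative-error example above; with a unit normal it reduces to sigma squared, but the same form covers per-axis errors:

```python
import numpy as np

def related_term_variance(n: np.ndarray, d: float,
                          rel_err: float = 0.01) -> float:
    """Propagate the ranging sensor's error onto the scalar related term:
    R = n^T S n, with S here an isotropic 3x3 covariance whose standard
    deviation is a fixed fraction of the measured depth d."""
    sigma = rel_err * d
    S = np.eye(3) * sigma**2        # per-axis variance of the detection
    return float(n @ S @ n)         # the 1x1 observation variance R
```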
The error $R_{1\times 1}$ is fed into the extended Kalman filter as the error of the observed quantity, and the filter updates the estimated depth of the feature point to determine its actual depth. In practical applications, while the unmanned device is driving, the camera and the single-point ranging sensor usually acquire data several times at the same position. The depths of the feature points in each depth image can therefore be measured by the above method in the order in which the camera obtains the depth images; for each feature point, if it appears for the first time, the initial depth value from step S100 is taken as its estimated depth and then updated, and if it appears for the second or a later time, the estimated depth updated at its previous appearance is taken as its estimated depth and updated again. As those skilled in the art know, the more times a Kalman-filter estimate is updated, the closer the result converges to the true value, so this method further improves the accuracy of feature-point depth assignment in the depth image.
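A minimal extended-Kalman-filter update for one feature's depth, under the simplifying assumption that the state is only that depth (the patent's full state X also carries the sensor poses, so this is a sketch of the update step, not the complete filter). Here v_ray and p0 are assumptions: v_ray is the unit ray along which the feature's position scales with its depth, and p0 the ray origin, so the Jacobian of the related term with respect to the depth is n·v_ray and the predicted observation is zero because the point should lie on the plane:

```python
import numpy as np

def ekf_depth_update(lam: float, P: float, r: float, R: float,
                     n: np.ndarray, v_ray: np.ndarray):
    """One scalar EKF update of the estimated depth lam with variance P,
    using the related term r (variance R) as the observed quantity.
    Measurement model: r = n^T (p0 + lam * v_ray - d * u), hence the
    Jacobian H = n^T v_ray; the ideal observation is 0."""
    H = float(n @ v_ray)            # d r / d lam
    S = H * P * H + R               # innovation variance
    K = P * H / S                   # Kalman gain
    lam_new = lam + K * (0.0 - r)   # drive the residual toward zero
    P_new = (1.0 - K * H) * P
    return lam_new, P_new
```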
Based on the same idea as the depth measurement method above, this specification also provides a corresponding depth measurement device, as shown in fig. 3.
Fig. 3 is a schematic diagram of a depth measurement device provided in the present specification, specifically including:
an acquisition module 200, configured to acquire a depth image and the position of the point detected by a single-point ranging sensor in the depth image;
a first determining module 202, configured to determine, according to the position in the depth image of the point detected by the single-point ranging sensor, feature points within a preset range of that position in the depth image;
a second determining module 204, configured to determine the estimated position of each feature point according to its estimated depth, and to determine a standard plane according to the estimated positions;
a selection module 206, configured to select feature points to be determined from the feature points according to the standard plane;
and a depth determining module 208, configured to adjust, for each feature point to be determined, the estimated depth of that feature point with its position lying on the standard plane as the optimization target, so as to determine its actual depth.
In an alternative embodiment:
the second determining module 204 is specifically configured to determine a plurality of triangles by taking the feature points as vertices, each feature point being a vertex of at least one triangle; to determine, according to the position in the depth image of the point detected by the single-point ranging sensor, the triangles containing that point, and to take as the central plane the plane corresponding to the triangle, among them, whose vertices have the smallest sum of distances to the detected point; and to determine a standard plane according to the central plane and the triangles.
In an alternative embodiment:
the second determining module 204 is specifically configured to take, for each triangle, the plane corresponding to that triangle as a plane to be combined; and, if the included angle between the plane to be combined and the central plane is not larger than a first specified threshold and the distance from the position of the point detected by the single-point ranging sensor to the plane to be combined is not larger than a second specified threshold, to re-determine the central plane according to the central plane and the plane to be combined, until a specified condition is met, taking the finally determined central plane as the standard plane.
In an alternative embodiment:
the depth determining module 208 is specifically configured to determine, for each feature point to be determined, a displacement vector between that feature point and the point detected by the single-point ranging sensor according to their positions; to determine a related term according to the displacement vector and the normal vector of the standard plane; and to adjust the estimated depth of the feature point to be determined according to the related term, so as to determine its actual depth.
In an alternative embodiment:
the depth determining module 208 is specifically configured to determine the error of the depth of the point detected by the single-point ranging sensor according to that depth; to determine the error of the related term according to the error of the detected depth; and to adjust the estimated depth of the feature point to be determined according to the error of the related term, so as to determine its actual depth.
In an alternative embodiment:
the depth determining module 208 is specifically configured to determine the error of the related term according to the error of the depth of the point detected by the single-point ranging sensor and the normal vector of the standard plane.
In an alternative embodiment:
the depth determining module 208 is specifically configured to take the error of the related term as the error of the observed quantity and to determine the actual depth of the feature point to be determined using extended Kalman filtering.
The present specification also provides a computer readable storage medium storing a computer program operable to perform the depth measurement method provided in fig. 1 above.
The present specification also provides a schematic structural diagram of the electronic device, shown in fig. 4. At the hardware level, as shown in fig. 4, the electronic device includes a processor, an internal bus, a network interface, memory, and non-volatile storage, and of course may also include other hardware required by the business. The processor reads the corresponding computer program from the non-volatile storage into memory and runs it to implement the depth measurement method described above with respect to fig. 1. Of course, besides a software implementation, this specification does not exclude other implementations, such as logic devices or combinations of hardware and software; that is, the execution subject of the following processing flow is not limited to logic units and may also be hardware or a logic device.
In the 1990s, an improvement to a technology could be clearly distinguished as an improvement in hardware (for example, an improvement to a circuit structure such as a diode, transistor, or switch) or an improvement in software (an improvement to a method flow). With the development of technology, however, many of today's improvements to method flows can be regarded as direct improvements to hardware circuit structures: designers almost always obtain the corresponding hardware circuit structure by programming the improved method flow into a hardware circuit. Therefore, it cannot be said that an improvement of a method flow cannot be realized by a hardware entity module. For example, a programmable logic device (Programmable Logic Device, PLD), such as a field programmable gate array (Field Programmable Gate Array, FPGA), is an integrated circuit whose logic function is determined by the user's programming of the device. A designer programs to "integrate" a digital system onto a PLD, without asking a chip manufacturer to design and fabricate an application-specific integrated circuit chip. Moreover, instead of manually making integrated circuit chips, such programming is nowadays mostly implemented with "logic compiler" software, which is similar to the software compiler used in program development; the original code to be compiled must be written in a specific programming language called a hardware description language (Hardware Description Language, HDL), of which there is not just one but many kinds, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM, and RHDL (Ruby Hardware Description Language); VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are currently the most commonly used. It will also be apparent to those skilled in the art that a hardware circuit implementing a given logic method flow can easily be obtained merely by slightly logically programming the method flow into an integrated circuit using one of the above hardware description languages.
The controller may be implemented in any suitable manner. For example, the controller may take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (such as software or firmware) executable by the (micro)processor, logic gates, switches, an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a programmable logic controller, or an embedded microcontroller; examples of controllers include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicon Labs C8051F320. A memory controller may also be implemented as part of the control logic of a memory. Those skilled in the art also know that, besides implementing a controller purely as computer-readable program code, it is entirely possible to logically program the method steps so that the controller realizes the same functions in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Such a controller can therefore be regarded as a hardware component, and the means included in it for realizing various functions can also be regarded as structures within the hardware component, or even as both software modules implementing the method and structures within the hardware component.
The system, apparatus, module or unit set forth in the above embodiments may be implemented in particular by a computer chip or entity, or by a product having a certain function. One typical implementation is a computer. In particular, the computer may be, for example, a personal computer, a laptop computer, a cellular telephone, a camera phone, a smart phone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
For convenience of description, the above devices are described as being functionally divided into various units, respectively. Of course, the functions of each element may be implemented in one or more software and/or hardware elements when implemented in the present specification.
It will be appreciated by those skilled in the art that embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory on a computer-readable medium, random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, can store information by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transitory media) such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," and any other variants thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or apparatus comprising a list of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
It will be appreciated by those skilled in the art that embodiments of the present description may be provided as a method, system, or computer program product. Accordingly, the present specification may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present description can take the form of a computer program product on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
The description may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The specification may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
In this specification, each embodiment is described in a progressive manner, and identical and similar parts of each embodiment are all referred to each other, and each embodiment mainly describes differences from other embodiments. In particular, for system embodiments, since they are substantially similar to method embodiments, the description is relatively simple, as relevant to see a section of the description of method embodiments.
The foregoing is merely exemplary of the present disclosure and is not intended to limit the disclosure. Various modifications and alterations to this specification will become apparent to those skilled in the art. Any modifications, equivalent substitutions, improvements, or the like, which are within the spirit and principles of the present description, are intended to be included within the scope of the claims of the present application.

Claims (10)

1. A depth measurement method, comprising:
acquiring a depth image and the position of a point detected by a single-point ranging sensor in the depth image;
determining, according to the position in the depth image of the point detected by the single-point ranging sensor, feature points within a preset range of that position in the depth image;
Determining the estimated position of each feature point according to the estimated depth of each feature point, and determining a standard plane according to the estimated position of each feature point;
selecting feature points to be determined from the feature points according to the standard plane;
and, for each feature point to be determined, adjusting the estimated depth of that feature point with its position lying on the standard plane as the optimization target, so as to determine its actual depth.
2. The method of claim 1, wherein determining the estimated position of each feature point based on the estimated depth of each feature point, and determining the standard plane based on the estimated position of each feature point, comprises:
determining a plurality of triangles with the feature points as vertices, wherein each feature point is a vertex of at least one triangle;
determining, according to the position in the depth image of the point detected by the single-point ranging sensor, the triangle in which that point is located, and taking as the central plane the plane corresponding to the triangle whose vertices have the smallest sum of distances to the point detected by the single-point ranging sensor;
and determining the standard plane according to the central plane and the plurality of triangles.
3. The method according to claim 2, wherein determining the standard plane from the central plane and the plurality of triangles specifically comprises:
for the plane corresponding to each triangle, taking that plane as a plane to be merged;
if the included angle between the plane to be merged and the central plane is not larger than a first specified threshold, and the distance from the position of the point detected by the single-point ranging sensor to the plane to be merged is not larger than a second specified threshold, re-determining the central plane according to the current central plane and the plane to be merged, until a specified condition is met, and taking the finally determined central plane as the standard plane.
4. The method of claim 1, wherein, for each pending feature point, adjusting the estimated depth of the pending feature point with placing it on the standard plane as the optimization objective, so as to determine its actual depth, specifically comprises:
for each pending feature point, determining a displacement vector between the pending feature point and the point detected by the single-point ranging sensor according to the position of the pending feature point and the position of the point detected by the single-point ranging sensor;
determining a correlation term according to the displacement vector and a normal vector of the standard plane;
and adjusting, according to the correlation term, the estimated depth of the pending feature point so as to determine its actual depth.
5. The method of claim 4, wherein adjusting the estimated depth of the pending feature point based on the correlation term to determine the actual depth of the pending feature point comprises:
determining an error of the depth of the point detected by the single-point ranging sensor according to the depth of the point detected by the single-point ranging sensor;
determining an error of the correlation term according to the error of the depth of the point detected by the single-point ranging sensor;
and adjusting the estimated depth of the pending feature point according to the error of the correlation term so as to determine the actual depth of the pending feature point.
6. The method of claim 5, wherein determining the error of the correlation term based on the error of the depth of the point detected by the single-point ranging sensor specifically comprises:
and determining the error of the correlation term according to the error of the depth of the point detected by the single-point ranging sensor and the normal vector of the standard plane.
7. The method of claim 5, wherein adjusting the estimated depth of the pending feature point based on the error of the correlation term, so as to determine the actual depth of the pending feature point, specifically comprises:
taking the error of the correlation term as the error of the observation, and using extended Kalman filtering to determine the actual depth of the pending feature point.
8. A depth measurement device, comprising:
an acquisition module, configured to acquire a depth image and the position, in the depth image, of a point detected by a single-point ranging sensor;
a first determining module, configured to determine, according to the position in the depth image of the point detected by the single-point ranging sensor, feature points within a preset range around that position;
a second determining module, configured to determine the estimated position of each feature point according to the estimated depth of that feature point, and to determine a standard plane according to the estimated positions of the feature points;
a selection module, configured to select pending feature points from the feature points according to the standard plane;
and a depth determining module, configured to, for each pending feature point, adjust the estimated depth of the pending feature point with placing the pending feature point on the standard plane as the optimization objective, so as to determine the actual depth of the pending feature point.
9. A computer-readable storage medium, characterized in that the storage medium stores a computer program which, when executed by a processor, implements the method of any one of claims 1 to 7.
10. An unmanned device, comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the program, implements the method of any one of claims 1 to 7.
CN202111261392.3A 2021-10-28 2021-10-28 Depth measurement method and device, storage medium and electronic equipment Active CN116051616B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111261392.3A CN116051616B (en) 2021-10-28 2021-10-28 Depth measurement method and device, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN116051616A (en) 2023-05-02
CN116051616B (en) 2024-07-23

Family

ID=86126040

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111261392.3A Active CN116051616B (en) 2021-10-28 2021-10-28 Depth measurement method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN116051616B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106052674A (en) * 2016-05-20 2016-10-26 青岛克路德机器人有限公司 Indoor robot SLAM method and system
GB201801399D0 (en) * 2017-12-13 2018-03-14 Xihua University Positioning method and apparatus
CN110766024A (en) * 2019-10-08 2020-02-07 湖北工业大学 Visual odometer feature point extraction method based on deep learning and visual odometer
CN111426299A (en) * 2020-06-15 2020-07-17 北京三快在线科技有限公司 Method and device for ranging based on depth of field of target object
CN112102386A (en) * 2019-01-22 2020-12-18 Oppo广东移动通信有限公司 Image processing method, image processing device, electronic equipment and computer readable storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
XIAO Yufeng; HUANG He; ZHENG Jie; LIU Ran: "Robot obstacle detection combining Kinect and 2D lidar", Journal of University of Electronic Science and Technology of China, no. 03, 30 May 2018 (2018-05-30) *

Also Published As

Publication number Publication date
CN116051616B (en) 2024-07-23

Similar Documents

Publication Publication Date Title
US20200011668A1 (en) Simultaneous location and mapping (slam) using dual event cameras
US9386209B2 (en) Method and apparatus for estimating position
US9071829B2 (en) Method and system for fusing data arising from image sensors and from motion or position sensors
EP2385496A1 (en) Extraction of 2D surfaces from a 3D point cloud
CN111797906B (en) Method and device for positioning based on vision and inertial mileage
CN115880685B (en) Three-dimensional target detection method and system based on volntet model
CN116740361B (en) Point cloud segmentation method and device, storage medium and electronic equipment
CN114061586B (en) Method and product for generating navigation path of electronic device
CN114061573B (en) Ground unmanned vehicle formation positioning device and method
CN116309823A (en) Pose determining method, pose determining device, pose determining equipment and storage medium
US20240118419A1 (en) Localization method and apparatus, computer apparatus and computer readable storage medium
CN111798489B (en) Feature point tracking method, device, medium and unmanned equipment
CN116051616B (en) Depth measurement method and device, storage medium and electronic equipment
KR20220146901A (en) Method and Apparatus for Accelerating Simultaneous Localization and Mapping
CN116977446A (en) Multi-camera small target identification and joint positioning method and system
CN116465393A (en) Synchronous positioning and mapping method and device based on area array laser sensor
CN112461258A (en) Parameter correction method and device
CN112393723B (en) Positioning method, positioning device, medium and unmanned equipment
CN112712561A (en) Picture construction method and device, storage medium and electronic equipment
CN113310484A (en) Mobile robot positioning method and system
CN116740114B (en) Object boundary fitting method and device based on convex hull detection
CN116558504B (en) Monocular vision positioning method and device
CN118053153B (en) Point cloud data identification method and device, storage medium and electronic equipment
CN116740197B (en) External parameter calibration method and device, storage medium and electronic equipment
CN116993931A (en) Map construction method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant