CN113807442A - Target shape and course estimation method and system - Google Patents


Info

Publication number
CN113807442A
Authority
CN
China
Prior art keywords
point cloud
quadrant
max
rectangular frame
formula
Prior art date
Legal status
Granted
Application number
CN202111097076.7A
Other languages
Chinese (zh)
Other versions
CN113807442B (en)
Inventor
谢国涛
毛一鸣
边有钢
秦兆博
秦晓辉
王晓伟
秦洪懋
胡满江
徐彪
汪东升
丁荣军
Current Assignee
Wuxi Institute Of Intelligent Control Hunan University
Original Assignee
Wuxi Institute Of Intelligent Control Hunan University
Priority date
Filing date
Publication date
Application filed by Wuxi Institute Of Intelligent Control Hunan University filed Critical Wuxi Institute Of Intelligent Control Hunan University
Priority to CN202111097076.7A
Publication of CN113807442A
Application granted
Publication of CN113807442B
Legal status: Active

Classifications

    • G06F18/23 — Pattern recognition; analysing; clustering techniques
    • G06T7/62 — Image analysis; analysis of geometric attributes of area, perimeter, diameter or volume
    • G06T7/66 — Image analysis; analysis of geometric attributes of image moments or centre of gravity
    • G06T7/73 — Determining position or orientation of objects or cameras using feature-based methods
    • G06T2207/10028 — Range image; depth image; 3D point clouds


Abstract

The invention discloses a target shape and course estimation method and system. The method comprises: step S1, acquiring original target point cloud data and clustering it; step S2, describing the clustered point cloud data with a three-dimensional bounding box; step S3, rotating the x and y coordinates of the point cloud by a preset step along a set direction; step S4, establishing a two-dimensional rectangular coordinate system with the center of the planar rectangular frame of the xoy plane of the three-dimensional bounding box as the origin, and calculating an objective function value according to the quadrant containing the point cloud feature point and the shape presented by the point cloud distribution in the clustered point cloud data; step S5, judging whether the current course angle is within the set traversal angle range, returning to step S3 if so, and otherwise proceeding to step S6; step S6, selecting the course angle corresponding to the maximum objective function value as the optimal course angle; and step S7, determining the position, size and attitude of the optimal three-dimensional bounding box of the target according to the position, in the laser radar coordinate system, of the center of the planar rectangular frame corresponding to the optimal course angle.

Description

Target shape and course estimation method and system
Technical Field
The invention relates to the technical field of automatic driving perception, in particular to a method and a system for estimating a target shape and a target course.
Background
The laser radar is one of the important sensors of a perception system, providing high-precision, high-resolution active range and angle measurement. By processing the point cloud data it acquires, functions such as target detection, tracking and identification of the environment around an intelligent vehicle can be realized. Target shape estimation is an important link in target detection: a good shape estimation result provides accurate target information for the subsequent tracking and prediction stages, improving the environment perception capability of the intelligent vehicle. Target shape information includes the length, width, height, position and course angle of the target.
In the prior art, only the point cloud corresponding to the outline visible from the laser radar viewpoint can be observed, and a model is then fitted to the observed points, e.g. an L-shape-fitted three-dimensional bounding box or a key-point-fitted bounding box; L-shaped point cloud distributions are the most common in obstacle data formed by laser point clouds. In summary, existing intelligent-vehicle target shape estimation methods are affected by uncertainty factors such as point cloud sparsity and laser radar noise, so the size and course of a three-dimensional bounding box fitted to an L-shaped point cloud have poor accuracy.
Disclosure of Invention
It is an object of the present invention to provide a target shape and heading estimation method and system that overcomes or at least mitigates at least one of the above-mentioned disadvantages of the prior art.
In order to achieve the above object, the present invention provides a method for estimating a target shape and a heading, the method comprising:
step S1, acquiring original target point cloud data, and clustering to obtain clustered point cloud data;
step S2, describing the clustered point cloud data through a three-dimensional bounding box;
step S3, rotating the x and y coordinates of each point cloud in the clustered point cloud data along a set direction by a preset step length;
step S4, establishing a two-dimensional rectangular coordinate system xO1y, with the center O1 of the planar rectangular frame of the xoy plane of the three-dimensional bounding box as the origin, to form four quadrants; determining, from the quadrant containing the point cloud feature point O2(x_c, y_c) described by formula (2), the area of the planar rectangular frame, the sum of distances from the point cloud to the planar rectangular frame and the point-count extreme difference of the quadrants; and then calculating an objective function value according to the shape presented by the point cloud distribution in the clustered point cloud data;

x_c = (1/n)*Σ_{i=1}^{n} x_ri,  y_c = (1/n)*Σ_{i=1}^{n} y_ri   (2)

in the formula, x_ri is the x coordinate of the rotated ith point cloud obtained in step S3, y_ri is the y coordinate of the rotated ith point cloud obtained in step S3, and n is the total number of point clouds in the clustered point cloud data;
step S5, judging whether the current course angle is in the set traversal angle range, if yes, returning to step S3; otherwise, go to step S6; the course angle is an included angle between the orientation of the target and an x axis of a laser radar coordinate system;
step S6, selecting the course angle corresponding to the maximum objective function value as the optimal course angle;
and step S7, determining the position, size and posture of the optimal three-dimensional boundary frame of the target according to the position of the center of the plane rectangular frame corresponding to the optimal course angle under the laser radar coordinate system.
Further, in step S4, the objective function value L is calculated by formula (17) in the case where the point cloud distribution shape is L-shaped, and the objective function value L′ is calculated by formula (25) in the case where the point cloud distribution shape is non-L-shaped:

L = (n_d/S)^2/D   (17)

L′ = 1/(S*D_n)   (25)

in the formula, n_d is the point-count extreme difference, S is the area of the planar rectangular frame, and D and D_n are the sums of distances from the point cloud to the planar rectangular frame for L-shaped and non-L-shaped point cloud distributions, respectively.
Further, in step S4, n_d is calculated by formula (7):

n_d = n_max - n_min   (7)

in the formula, n_max is the number of point clouds in the quadrant containing the point cloud feature point, and n_min is the number of point clouds in the quadrant diagonal to the quadrant containing the point cloud feature point.
Further, in step S4, S is calculated by formula (6):

S = a*b   (6)

a = x_max - x_min   (4)

b = y_max - y_min   (5)

in the formula, x_max is the maximum of x_ri, x_min is the minimum of x_ri, y_max is the maximum of y_ri, and y_min is the minimum of y_ri.
Further, in step S4, D_n is calculated by formula (24):

D_n = Σ_{i∈Q1} d′_1i + Σ_{i∈Q2} d′_2i + Σ_{i∈Q3} d′_3i + Σ_{i∈Q4} d′_4i   (24)

d′_1i = min{|x_ri - x_max|, |y_ri - y_max|}   (20)

d′_2i = min{|x_ri - x_max|, |y_ri - y_min|}   (21)

d′_3i = min{|x_ri - x_min|, |y_ri - y_min|}   (22)

d′_4i = min{|x_ri - x_min|, |y_ri - y_max|}   (23)

in the formulas, Q1 to Q4 denote the sets of points lying in the first to fourth quadrants, d′_1i is the closest distance, given by formula (20), from the ith point cloud in the first quadrant to the planar rectangular frame, d′_2i is the closest distance, given by formula (21), from the ith point cloud in the second quadrant to the planar rectangular frame, d′_3i is the closest distance, given by formula (22), from the ith point cloud in the third quadrant to the planar rectangular frame, and d′_4i is the closest distance, given by formula (23), from the ith point cloud in the fourth quadrant to the planar rectangular frame.
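The per-quadrant nearest-edge rule of formulas (20)-(23) and the sum of formula (24) can be sketched as follows. This is a minimal illustration, not the patented implementation: the function name is hypothetical, the quadrant numbering is inferred from the edge pair each formula uses, and the quadrant partition is taken about the frame center O1.

```python
def distance_sum_non_l(xr, yr):
    """Sum D_n of nearest point-to-frame-edge distances, formulas (20)-(24).

    Assumed quadrant numbering, inferred from the edge pairs:
    Q1 -> (x_max, y_max), Q2 -> (x_max, y_min),
    Q3 -> (x_min, y_min), Q4 -> (x_min, y_max).
    """
    x_max, x_min = max(xr), min(xr)
    y_max, y_min = max(yr), min(yr)
    x0, y0 = (x_max + x_min) / 2.0, (y_max + y_min) / 2.0  # frame centre O1
    dn = 0.0
    for x, y in zip(xr, yr):
        if x >= x0 and y >= y0:      # first quadrant  -> formula (20)
            dn += min(abs(x - x_max), abs(y - y_max))
        elif x >= x0:                # second quadrant -> formula (21)
            dn += min(abs(x - x_max), abs(y - y_min))
        elif y < y0:                 # third quadrant  -> formula (22)
            dn += min(abs(x - x_min), abs(y - y_min))
        else:                        # fourth quadrant -> formula (23)
            dn += min(abs(x - x_min), abs(y - y_max))
    return dn                        # formula (24): sum over all points
```

With this D_n, the non-L objective of formula (25) is simply 1/(S*D_n).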
Further, in step S4, D is calculated by formula (16):

D = Σ_{i∈Q1} d_1i + Σ_{i∈Q2} d_2i + Σ_{i∈Q3} d_3i + Σ_{i∈Q4} d_4i   (16)

in the formula, Q1 to Q4 denote the sets of points lying in the first to fourth quadrants, and d_1i, d_2i, d_3i and d_4i are the closest distances from the ith point cloud in the first, second, third and fourth quadrants, respectively, to the planar rectangular frame.
Further, in the case where O2 is located in the first quadrant, d_1i, d_2i, d_3i and d_4i are obtained by calculation of the following formulas (8) and (9), where k (k > 1) is a penalty coefficient:

d_1i = min{|x_ri - x_max|, |y_ri - y_max|}
d_2i = min{|x_ri - x_max|, |y_ri - y_min|}   (8)
d_4i = min{|x_ri - x_min|, |y_ri - y_max|}

d_3i = k*max{|x_ri - x_min|, |y_ri - y_min|}   (9)

in the case where O2 is located in the second quadrant, d_1i, d_2i, d_3i and d_4i are obtained by calculation of the following formulas (10) and (11):

d_1i = min{|x_ri - x_max|, |y_ri - y_max|}
d_2i = min{|x_ri - x_max|, |y_ri - y_min|}   (10)
d_3i = min{|x_ri - x_min|, |y_ri - y_min|}

d_4i = k*max{|x_ri - x_min|, |y_ri - y_max|}   (11)

in the case where O2 is located in the third quadrant, d_1i, d_2i, d_3i and d_4i are obtained by calculation of the following formulas (12) and (13):

d_2i = min{|x_ri - x_max|, |y_ri - y_min|}
d_3i = min{|x_ri - x_min|, |y_ri - y_min|}   (12)
d_4i = min{|x_ri - x_min|, |y_ri - y_max|}

d_1i = k*max{|x_ri - x_max|, |y_ri - y_max|}   (13)

in the case where O2 is located in the fourth quadrant, d_1i, d_2i, d_3i and d_4i are obtained by calculation of the following formulas (14) and (15):

d_1i = min{|x_ri - x_max|, |y_ri - y_max|}
d_3i = min{|x_ri - x_min|, |y_ri - y_min|}   (14)
d_4i = min{|x_ri - x_min|, |y_ri - y_max|}

d_2i = k*max{|x_ri - x_max|, |y_ri - y_min|}   (15).
the invention also provides a target shape and course estimation system, which comprises:
the data analysis module is used for acquiring original target point cloud data;
the point cloud clustering module is used for clustering the target point cloud data to obtain clustered point cloud data;
a target shape and course estimation module, configured to: describe the clustered point cloud data with a three-dimensional bounding box; rotate the x and y coordinates of each point cloud in the clustered point cloud data by a preset step along a set direction; establish a two-dimensional rectangular coordinate system xO1y, with the center O1 of the planar rectangular frame of the xoy plane of the three-dimensional bounding box as the origin, to form four quadrants; determine, from the quadrant containing the point cloud feature point O2(x_c, y_c) described by formula (2), the area of the planar rectangular frame, the sum of distances from the point cloud to the planar rectangular frame and the point-count extreme difference of the quadrants; calculate an objective function value according to the shape presented by the point cloud distribution in the clustered point cloud data; select, within the set traversal angle range, the course angle corresponding to the maximum objective function value as the optimal course angle; and finally determine the position, size and attitude of the optimal three-dimensional bounding box of the target according to the position, in the laser radar coordinate system, of the center of the planar rectangular frame corresponding to the optimal course angle;
x_c = (1/n)*Σ_{i=1}^{n} x_ri,  y_c = (1/n)*Σ_{i=1}^{n} y_ri   (2)

in the formula, x_ri is the x coordinate of the rotated ith point cloud, y_ri is the y coordinate of the rotated ith point cloud, n is the total number of point clouds in the clustered point cloud data, and the course angle is the angle between the orientation of the target and the x axis of the laser radar coordinate system.
Further, the objective function value L is calculated by formula (17) in the case where the point cloud distribution shape is L-shaped, and the objective function value L′ is calculated by formula (25) in the case where the point cloud distribution shape is non-L-shaped:

L = (n_d/S)^2/D   (17)

L′ = 1/(S*D_n)   (25)

in the formula, n_d is the point-count extreme difference, S is the area of the planar rectangular frame, and D and D_n are the sums of distances from the point cloud to the planar rectangular frame for L-shaped and non-L-shaped point cloud distributions, respectively.
Further, n_d is calculated by formula (7), S by formula (6), D by formula (16), and D_n by formula (24):

n_d = n_max - n_min   (7)

S = a*b   (6)

a = x_max - x_min   (4)

b = y_max - y_min   (5)

D = Σ_{i∈Q1} d_1i + Σ_{i∈Q2} d_2i + Σ_{i∈Q3} d_3i + Σ_{i∈Q4} d_4i   (16)

D_n = Σ_{i∈Q1} d′_1i + Σ_{i∈Q2} d′_2i + Σ_{i∈Q3} d′_3i + Σ_{i∈Q4} d′_4i   (24)

in the formulas, n_max is the number of point clouds in the quadrant containing the point cloud feature point, n_min is the number of point clouds in the quadrant diagonal to the quadrant containing the point cloud feature point, x_max and x_min are the maximum and minimum of x_ri, y_max and y_min are the maximum and minimum of y_ri, Q1 to Q4 denote the sets of points lying in the first to fourth quadrants, d′_1i is the closest distance, given by formula (20), from the ith point cloud in the first quadrant to the planar rectangular frame, d′_2i is that for the second quadrant (formula (21)), d′_3i for the third quadrant (formula (22)), d′_4i for the fourth quadrant (formula (23)), and d_1i, d_2i, d_3i and d_4i are the closest distances from the ith point cloud in the first, second, third and fourth quadrants, respectively, to the planar rectangular frame.
Due to the adoption of the technical scheme, the invention has the following advantages:
Firstly, because point cloud data is subject to occlusion, most points fall on the surface nearest the laser radar; the objective function is designed around this characteristic, yielding a better bounding box result.
Secondly, fitting the planar rectangular frame is converted into an optimization problem by a traversal search over candidate frames, and a larger traversal step can be selected to accelerate the calculation.
Drawings
FIG. 1 is a flowchart of a method for estimating a target shape and a heading according to an embodiment of the invention;
FIG. 2 is a diagram of a coordinate system definition provided by an embodiment of the present invention;
FIG. 3 is a schematic diagram of a course angle traversal process provided in the embodiment of the present invention;
fig. 4 is a plane rectangular frame point cloud coordinate system partition diagram provided in the embodiment of the present invention, where a is a point cloud feature point located in a first quadrant, b is a point cloud feature point located in a second quadrant, c is a point cloud feature point located in a third quadrant, and d is a point cloud feature point located in a fourth quadrant;
FIG. 5 is a partition diagram of a point cloud coordinate system with a planar rectangular frame according to an embodiment of the present invention, wherein the point cloud has a non-L-shaped distribution;
FIG. 6 is a diagram of a system for estimating a target shape and a heading according to an embodiment of the invention.
Detailed Description
In the drawings, the same or similar reference numerals are used to denote the same or similar elements or elements having the same or similar functions. Embodiments of the present invention will be described in detail below with reference to the accompanying drawings.
In the description of the present invention, the terms "central", "longitudinal", "lateral", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", etc., indicate orientations or positional relationships based on those shown in the drawings, and are only for convenience in describing the present invention and simplifying the description, but do not indicate or imply that the device or element being referred to must have a particular orientation, be constructed in a particular orientation, and be operated, and therefore, should not be construed as limiting the scope of the present invention.
The target shape and course estimation method provided by the embodiment of the invention comprises the following steps:
and step S1, acquiring original target point cloud data, and clustering to obtain clustered point cloud data.
The clustering method is not limited, and may be Euclidean or non-Euclidean; since the embodiment of the invention only processes already-clustered target point clouds, the process of obtaining the clustered point cloud data is not described.
Step S2, the clustered point cloud data is described by a three-dimensional bounding box.
The clustered point cloud data comprises the three-dimensional space points reflected from the target surface scanned by the laser radar. In the laser coordinate system, each cluster can be described by a three-dimensional bounding box, whose x, y and z coordinate values correspond to those of the target point cloud data. For example, when the laser radar is installed on a vehicle, the three axes of the laser coordinate system are defined as follows: the positive x axis points straight ahead of the vehicle, the positive y axis points to the right of the vehicle head, and the positive z axis points vertically upward, i.e. along the vehicle height. The course angle θ is then the angle between the orientation of the target and the x axis, and θ_0 is the initial value of the course angle, for example 0.
As shown in fig. 2, the three-dimensional bounding box depicts the clustered point cloud data. The black dots represent the point clouds in the cluster; the origin O of the laser radar coordinate system is the position of the laser radar; and a, b and c describe the size of the clustered point cloud data along the x, y and z axes respectively.
Step S3, the x and y coordinates of each point cloud in the clustered point cloud data are rotated by a preset step Δ θ along a set direction. For example: Δ θ is 1 °. Fig. 3 shows the rotation in the counter-clockwise direction. However, it may also be rotated in a clockwise direction.
For example: the total number of point clouds in the clustered point cloud data is n, and the x, y coordinates of the ith point cloud are (x_i, y_i). After the rotation of step S3 they become (x_ri, y_ri), with the correspondence expressed as formula (1), where θ is the current course angle:

x_ri = x_i*cosθ - y_i*sinθ
y_ri = x_i*sinθ + y_i*cosθ   (1)
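As a minimal sketch of the rotation of step S3 (the counter-clockwise sign convention is assumed from fig. 3, and the function name is illustrative, not from the patent):

```python
import math

def rotate_cloud(points, theta):
    """Rotate the (x, y) coordinates of every point by angle theta.

    Counter-clockwise rotation is assumed, matching fig. 3; the clockwise
    variant simply negates theta.
    """
    c, s = math.cos(theta), math.sin(theta)
    return [(c * x - s * y, s * x + c * y) for x, y in points]
```

Calling this repeatedly with the step Δθ (or once with the accumulated angle) yields the rotated coordinates (x_ri, y_ri) used in the following steps.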
Step S4, as shown in fig. 4, a two-dimensional rectangular coordinate system xO1y is established with the center O1 of the planar rectangular frame of the xoy plane of the three-dimensional bounding box as the origin, forming four quadrants. From the quadrant containing the point cloud feature point O2(x_c, y_c) described by formula (2), the area S of the planar rectangular frame, the distance sum D from the point cloud to the planar rectangular frame and the point-count extreme difference n_d are determined, and the objective function value L is then calculated according to the shape presented by the point cloud distribution in the clustered point cloud data.

x_c = (1/n)*Σ_{i=1}^{n} x_ri,  y_c = (1/n)*Σ_{i=1}^{n} y_ri   (2)

In the formula, x_ri is the x coordinate of the rotated ith point cloud obtained in step S3, y_ri is the y coordinate of the rotated ith point cloud obtained in step S3, and n is the total number of point clouds in the clustered point cloud data.
Wherein O1(x_ac, y_ac) can be calculated using the following formulas (3) to (6):

x_ac = (x_max + x_min)/2,  y_ac = (y_max + y_min)/2   (3)

in the formula, x_max is the maximum of x_ri, x_min is the minimum of x_ri, y_max is the maximum of y_ri, and y_min is the minimum of y_ri.
Referring to fig. 2, a and b of the three-dimensional bounding box are calculated by the following formulas (4) and (5):

a = x_max - x_min   (4)

b = y_max - y_min   (5)

The area S of the planar rectangular frame can then be calculated by formula (6):

S = a*b   (6)
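Formulas (3)-(6) together amount to the following computation (a sketch: the midpoint form of the centre O1 is an assumed reading of formula (3), and the function name is illustrative):

```python
def box_center_and_area(xr, yr):
    """Centre O1, side lengths a, b, and area S of the planar rectangular
    frame enclosing the rotated coordinates (formulas (3)-(6))."""
    x_max, x_min = max(xr), min(xr)
    y_max, y_min = max(yr), min(yr)
    x_ac = (x_max + x_min) / 2.0   # formula (3): midpoint of the extremes
    y_ac = (y_max + y_min) / 2.0
    a = x_max - x_min              # formula (4)
    b = y_max - y_min              # formula (5)
    S = a * b                      # formula (6)
    return (x_ac, y_ac), a, b, S
```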
referring to fig. 1, due to the shielding property of the point cloud, a rigid closed surface target (which may also be referred to as an obstacle) is prevalent on the road, and thus the shape feature presented by the distribution of the point cloud in the original point cloud data is approximately L-shaped. First, an implementation of step S4 will be described with an L-shaped point cloud distribution shape as an example.
As shown in fig. 4, according to the positional relationship between the planar rectangular frame center O1(x_ac, y_ac) and the point cloud feature point O2(x_c, y_c), the point cloud is divided into the following four cases:
wherein the black dots are point clouds and the black square is the calculated O2. In fig. 4, a indicates that O2 is located in the first quadrant of xO1y, b in the second quadrant, c in the third quadrant, and d in the fourth quadrant. The quadrant in which O2 is located is the quadrant with the largest number of point clouds, and the diagonally opposite quadrant has the smallest number; the difference between these two counts is the point-count extreme difference n_d given by formula (7):

n_d = n_max - n_min   (7)

wherein n_max is the total number of point clouds in the quadrant containing the point cloud feature point, and n_min is the total number of point clouds in the quadrant diagonally opposite it.
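The quadrant partition and the point-count extreme difference of formula (7) can be sketched as follows. Two details are assumptions not fixed by this text: O2 is taken as the centroid of the rotated points, and the quadrant numbering follows the edge pairs used in formulas (8)-(15).

```python
def point_count_extreme_difference(xr, yr):
    """n_d of formula (7): count of points in the quadrant holding O2
    minus the count in the diagonally opposite quadrant."""
    n = len(xr)
    xc, yc = sum(xr) / n, sum(yr) / n            # O2 as centroid (assumed)
    x0 = (max(xr) + min(xr)) / 2.0               # frame centre O1
    y0 = (max(yr) + min(yr)) / 2.0

    def quadrant(x, y):
        # Assumed numbering: Q1 (+x, +y), Q2 (+x, -y), Q3 (-x, -y), Q4 (-x, +y)
        if x >= x0 and y >= y0:
            return 1
        if x >= x0:
            return 2
        return 3 if y < y0 else 4

    counts = {1: 0, 2: 0, 3: 0, 4: 0}
    for x, y in zip(xr, yr):
        counts[quadrant(x, y)] += 1
    q = quadrant(xc, yc)                         # quadrant of O2 -> n_max
    opposite = {1: 3, 2: 4, 3: 1, 4: 2}          # diagonal quadrant -> n_min
    return counts[q] - counts[opposite[q]]       # formula (7)
```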
Because the point cloud distribution differs in the four cases, the method of calculating the distance sum D between the point cloud and the planar rectangular frame also differs; the cases are set out below:
In the first case, shown as a in fig. 4, O2 is located in the first quadrant of xO1y. To obtain a planar rectangular frame closer to each point cloud, the distance between the ith point cloud and the nearest side of the planar rectangular frame (hereinafter the nearest distance to the planar rectangular frame) is calculated. Since the ith point cloud may lie in any of the first to fourth quadrants, different rules are specified for the different quadrants. In this case most of the point clouds are distributed in the first, second and fourth quadrants, so the following calculation rule, formula (8), is specified for these three quadrants:

d_1i = min{|x_ri - x_max|, |y_ri - y_max|}
d_2i = min{|x_ri - x_max|, |y_ri - y_min|}   (8)
d_4i = min{|x_ri - x_min|, |y_ri - y_max|}

wherein d_1i, d_2i and d_4i are the nearest distances from the ith point cloud in the first, second and fourth quadrant, respectively, to the planar rectangular frame.

For the third quadrant, under a reasonable partition of the planar-rectangular-frame coordinates the number of point clouds in this quadrant is the smallest. To keep it as small as possible, a penalty strategy is adopted when calculating the frame distance for points in this quadrant: the larger of the two edge distances is selected, increasing its share of the distance sum, and the value is further amplified by a penalty coefficient k (k > 1); in the invention k is taken as 2, but it is not limited to 2. The calculation formula for third-quadrant points is thus formula (9):

d_3i = k*max{|x_ri - x_min|, |y_ri - y_min|}   (9)
In the second case, shown as b in fig. 4, O2 is located in the second quadrant of xO1y. In this case most of the point clouds are distributed in the first, second and third quadrants, so the following calculation rule, formula (10), is specified for these three quadrants:

d_1i = min{|x_ri - x_max|, |y_ri - y_max|}
d_2i = min{|x_ri - x_max|, |y_ri - y_min|}   (10)
d_3i = min{|x_ri - x_min|, |y_ri - y_min|}

wherein d_1i, d_2i and d_3i are the nearest distances from the ith point cloud in the first, second and third quadrant, respectively, to the planar rectangular frame.

For the fourth quadrant, where the number of point clouds is the smallest, the same penalty strategy is adopted: the larger of the two edge distances is selected and amplified by the penalty coefficient k (taken as 2, but not limited to 2), increasing its share of the distance sum. The calculation formula is formula (11):

d_4i = k*max{|x_ri - x_min|, |y_ri - y_max|}   (11)
In the third case, shown as c in fig. 4, O2 is located in the third quadrant of xO1y. In this case most of the point clouds are distributed in the second, third and fourth quadrants, so the following calculation rule, formula (12), is specified for these three quadrants:

d_2i = min{|x_ri - x_max|, |y_ri - y_min|}
d_3i = min{|x_ri - x_min|, |y_ri - y_min|}   (12)
d_4i = min{|x_ri - x_min|, |y_ri - y_max|}

wherein d_2i, d_3i and d_4i are the nearest distances from the ith point cloud in the second, third and fourth quadrant, respectively, to the planar rectangular frame.

For the first quadrant, where the number of point clouds is the smallest, the same penalty strategy is adopted: the larger of the two edge distances is selected and amplified by the penalty coefficient k (taken as 2, but not limited to 2), increasing its share of the distance sum. The calculation formula is formula (13):

d_1i = k*max{|x_ri - x_max|, |y_ri - y_max|}   (13)
In the fourth case, shown as d in fig. 4, O2 is located in the fourth quadrant of xO1y. In this case most of the point clouds are distributed in the first, third and fourth quadrants, so the following calculation rule, formula (14), is specified for these three quadrants:

d_1i = min{|x_ri - x_max|, |y_ri - y_max|}
d_3i = min{|x_ri - x_min|, |y_ri - y_min|}   (14)
d_4i = min{|x_ri - x_min|, |y_ri - y_max|}

wherein d_1i, d_3i and d_4i are the nearest distances from the ith point cloud in the first, third and fourth quadrant, respectively, to the planar rectangular frame.

For the second quadrant, where the number of point clouds is the smallest, the same penalty strategy is adopted: the larger of the two edge distances is selected and amplified by the penalty coefficient k (taken as 2, but not limited to 2), increasing its share of the distance sum. The calculation formula is formula (15):

d_2i = k*max{|x_ri - x_max|, |y_ri - y_min|}   (15)
then, the four conditions are combined to obtain d under different point cloud distributions1i、d2i、d3iAnd d4iAnd calculating the distance D from the point cloud to the plane rectangular frame according to the following calculation formula (16):
D = Σ d1i + Σ d2i + Σ d3i + Σ d4i    (16)
where each sum runs over the points located in the corresponding quadrant.
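As an illustrative sketch (not the patent's reference implementation), the per-quadrant rules culminating in equation (16) can be written in Python as below. The quadrant numbering (Q1 top-right, Q2 bottom-right, Q3 bottom-left, Q4 top-left, inferred from the edge pairs in the per-quadrant formulas), the centring of the quadrant test on the rectangle center O1, and the function names are all assumptions:

```python
import numpy as np

def quadrant(x, y):
    """Quadrant index inferred from the per-quadrant edge pairs:
    Q1 top-right, Q2 bottom-right, Q3 bottom-left, Q4 top-left."""
    if x >= 0:
        return 1 if y >= 0 else 2
    return 3 if y < 0 else 4

def distance_sum(xr, yr, k=2.0):
    """Sum D of point-to-rectangle distances, a sketch of eq. (16).

    Points in the quadrant diagonal to the feature point O2 (the sparse
    quadrant) are penalised: they use the farther of their two near
    edges, scaled by the penalty coefficient k; all other points use
    the nearest edge.
    """
    x_min, x_max = xr.min(), xr.max()
    y_min, y_max = yr.min(), yr.max()
    cx, cy = (x_min + x_max) / 2, (y_min + y_max) / 2   # rectangle center O1
    # edge pair (ex, ey) whose distances |x-ex|, |y-ey| each quadrant uses
    edges = {1: (x_max, y_max), 2: (x_max, y_min),
             3: (x_min, y_min), 4: (x_min, y_max)}
    q_o2 = quadrant(xr.mean() - cx, yr.mean() - cy)     # O2 = centroid (eq. 2)
    q_pen = {1: 3, 2: 4, 3: 1, 4: 2}[q_o2]              # diagonal quadrant
    D = 0.0
    for xi, yi in zip(xr, yr):
        q = quadrant(xi - cx, yi - cy)
        ex, ey = edges[q]
        dx, dy = abs(xi - ex), abs(yi - ey)
        D += k * max(dx, dy) if q == q_pen else min(dx, dy)
    return D
```

Points lying exactly on the rectangle's corners contribute zero, so only interior points (and penalised sparse-quadrant points) drive D.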
Then, from the area S of the planar rectangular frame, the distance sum D from the point cloud to the planar rectangular frame, and the point-count extreme difference nd, the objective function L designed from these three indicators is calculated by equation (17):
L=(nd/S)2/D (17)
The larger the value of the objective function, the more reasonable the planar rectangular frame. Three factors are involved: the smaller the area S of the planar rectangular frame, the better; the smaller the distance sum D from the point cloud to the planar rectangular frame, the better; and the larger the point-count extreme difference nd, the better. After the objective function value L is calculated, the currently traversed heading angle θ and the value L are recorded.
Of course, L may also be calculated by modifying equation (17); for example, the exponent 2 in equation (17) may be changed to 1, 3, or another power.
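A minimal sketch of equation (17) with the adjustable exponent just mentioned; the function name and the guard against D = 0 are assumptions, not part of the patent:

```python
def objective_L(n_d, S, D, p=2):
    """Eq. (17): L = (n_d / S)**p / D; a larger L means a more
    reasonable rectangle. p = 2 is the value set in the patent;
    p = 1 or 3 are the alternatives mentioned in the text."""
    return (n_d / S) ** p / max(D, 1e-12)   # guard: avoid division by zero
```

Smaller S and D increase L, while a larger point-count extreme difference n_d increases it, matching the three indicators above.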
Step S5: judge whether the current heading angle is within the set traversal angle range; if so, return to step S3, otherwise proceed to step S6. The set traversal angle range is [-180°, 180°) or (-180°, 180°].
Step S6, selecting the heading angle corresponding to the maximum value L as the optimal heading angle θ'.
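Steps S3 to S6 amount to a grid search over heading angles. The sketch below assumes a generic objective callable and a counter-clockwise rotation convention, neither of which the patent fixes here:

```python
import numpy as np

def best_heading(points_xy, objective, step_deg=1.0):
    """Traverse heading angles in [-180, 180): rotate the cloud,
    evaluate the objective on the rotated coordinates, and keep the
    angle with the largest objective value (steps S3-S6 sketch).
    `points_xy` is an (N, 2) array; `objective(xr, yr)` returns a float."""
    best_val, best_theta = -np.inf, 0.0
    for theta in np.arange(-180.0, 180.0, step_deg):
        t = np.deg2rad(theta)
        R = np.array([[np.cos(t), -np.sin(t)],
                      [np.sin(t),  np.cos(t)]])
        xr, yr = (points_xy @ R.T).T
        val = objective(xr, yr)
        if val > best_val:
            best_val, best_theta = val, theta
    return best_theta
```

With an area-only objective 1/S, a rectangle rotated by 30° is recovered up to the inherent 90° ambiguity of a rectangle; the patent's L-shape objective resolves that ambiguity via the quadrant point counts.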
Step S7: determine the position, size and attitude of the optimal three-dimensional bounding box of the target from the position, in the lidar coordinate system, of the center of the planar rectangular frame corresponding to the optimal heading angle θ'.
The center coordinates (x'ac, y'ac) of the planar rectangular frame corresponding to the optimal heading angle θ' after rotation are given by equation (18):
x'ac = (x'max + x'min)/2,  y'ac = (y'max + y'min)/2    (18)
where x'max and x'min are the maximum and minimum x coordinates, and y'max and y'min the maximum and minimum y coordinates, of the target point cloud after rotation by the optimal heading angle θ'.
The coordinates (x, y, z) of the center point and the sizes (a, b, c) along the three axes of the three-dimensional bounding box in the lidar coordinate system are then given by equation (19), which determines the position, size and attitude of the optimal bounding box of the target.
(x, y) = R(θ')⁻¹ (x'ac, y'ac),  z = (zmax + zmin)/2
a = x'max - x'min,  b = y'max - y'min,  c = zmax - zmin    (19)
where R(θ') is the rotation applied in step S3 and zmax and zmin are the extreme z coordinates of the target point cloud.
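The back-transformation of equations (18) and (19) can be sketched as follows; the counter-clockwise forward-rotation convention and the yaw sign are assumptions, since the patent leaves the rotation direction to step S3:

```python
import numpy as np

def box_from_heading(points, theta_deg):
    """Given the optimal heading, build the rotated-frame rectangle,
    take its center (eq. 18), rotate that center back into the lidar
    frame, and return center (x, y, z), sizes (a, b, c) and yaw
    (sketch of eq. 19). `points` is an (N, 3) array."""
    t = np.deg2rad(theta_deg)
    R = np.array([[np.cos(t), -np.sin(t)],
                  [np.sin(t),  np.cos(t)]])
    xy_r = points[:, :2] @ R.T                     # rotated x', y'
    x_min, y_min = xy_r.min(axis=0)
    x_max, y_max = xy_r.max(axis=0)
    z_min, z_max = points[:, 2].min(), points[:, 2].max()
    centre_r = np.array([(x_min + x_max) / 2, (y_min + y_max) / 2])  # eq. 18
    cx, cy = R.T @ centre_r                        # inverse rotation by -theta
    centre = (cx, cy, (z_min + z_max) / 2)
    size = (x_max - x_min, y_max - y_min, z_max - z_min)             # a, b, c
    return centre, size, -theta_deg                # yaw sign is an assumption
```

At θ' = 0 the box degenerates to the axis-aligned bounding box of the cloud, which is a convenient sanity check.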
In another embodiment, the implementation of step S4 is described for the case where the point cloud distribution of the target obtained by the lidar is non-L-shaped, e.g. for roughly centrally symmetric targets such as trees, or for targets such as a vehicle located directly in front of the lidar.
As shown in FIG. 5, the point clouds are distributed across all of the first to fourth quadrants. To obtain a planar rectangular frame close to every point, the distance between the ith point cloud and its nearest edge is calculated; because the point cloud distribution differs across the four quadrants, the distance sum between the point cloud and the planar rectangular frame is calculated differently. The following four cases are distinguished:
a. The point cloud coordinates are located in the first quadrant, where the closest distance d'1i from the ith point cloud to the planar rectangular frame in the first quadrant is calculated by equation (20):
d′1i=min{|xri-xmax|,|yri-ymax|} (20)
b. The point cloud coordinates are located in the second quadrant, where the closest distance d'2i from the ith point cloud to the planar rectangular frame in the second quadrant is calculated by equation (21):
d′2i=min{|xri-xmax|,|yri-ymin|} (21)
c. The point cloud coordinates are located in the third quadrant, where the closest distance d'3i from the ith point cloud to the planar rectangular frame in the third quadrant is calculated by equation (22):
d′3i=min{|xri-xmin|,|yri-ymin|} (22)
d. The point cloud coordinates are located in the fourth quadrant, where the closest distance d'4i from the ith point cloud to the planar rectangular frame in the fourth quadrant is calculated by equation (23):
d′4i=min{|xri-xmin|,|yri-ymax|} (23)
Combining the four cases gives d'1i, d'2i, d'3i and d'4i for the points in the different quadrants, from which the sum Dn of the closest distances from the points to the planar rectangular frame is calculated by equation (24):
Dn = Σ d'1i + Σ d'2i + Σ d'3i + Σ d'4i    (24)
where each sum runs over the points located in the corresponding quadrant.
Then, using the area S of the planar rectangular frame and the sum Dn of the closest distances from the points to the planar rectangular frame, the objective function L' is designed and calculated by equation (25):
L′=1/(S*Dn) (25)
The larger the value of L', the more reasonable the planar rectangular frame: the smaller the area S of the planar rectangular frame, the better, and the smaller the sum Dn of the closest distances from the points to the planar rectangular frame, the better. After the objective function value is calculated, the currently traversed heading angle θ and the objective function value L' are recorded.
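Equations (20) to (25) reduce to a single nearest-edge pass over the cloud. The sketch below is an assumption-laden illustration: centring the quadrant test on the rectangle center, the degenerate-case guard, and the function name are not from the patent:

```python
import numpy as np

def objective_non_L(xr, yr):
    """Sketch of eq. (25): L' = 1 / (S * Dn), where Dn sums, for every
    point, the distance to the nearer of the two rectangle edges of its
    quadrant (eqs 20-23). Returns None for a degenerate cloud."""
    x_min, x_max = xr.min(), xr.max()
    y_min, y_max = yr.min(), yr.max()
    S = (x_max - x_min) * (y_max - y_min)
    cx, cy = (x_min + x_max) / 2, (y_min + y_max) / 2
    Dn = 0.0
    for xi, yi in zip(xr, yr):
        ex = x_max if xi >= cx else x_min          # nearer vertical edge
        ey = y_max if yi >= cy else y_min          # nearer horizontal edge
        Dn += min(abs(xi - ex), abs(yi - ey))
    if S == 0.0 or Dn == 0.0:
        return None                                # degenerate rectangle/cloud
    return 1.0 / (S * Dn)
```

Only points away from the rectangle outline contribute to Dn, so a cloud hugging the rectangle yields a very large L', as intended.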
In summary, the three-dimensional point cloud shape estimation method described in the present invention is applicable to all collected three-dimensional point cloud data; therefore, all changes and modifications that do not depart from the general concept of the present invention shall fall within its protection scope.
As shown in fig. 6, an embodiment of the present invention further provides a target shape and heading estimation system, including:
the data analysis module is used for acquiring original target point cloud data;
the point cloud clustering module is used for clustering the target point cloud data to obtain clustered point cloud data;
a target shape and heading estimation module for: describing the clustered point cloud data by a three-dimensional bounding box; rotating the x and y coordinates of each point in the clustered point cloud data along a set direction by a preset step; establishing a two-dimensional rectangular coordinate system xO1y, with the center O1 of the planar rectangular frame of the xoy plane corresponding to the three-dimensional bounding box as origin, to form four quadrants; determining, according to the quadrant in which the point cloud feature point O2(xc, yc) described by equation (2) is located, the area of the planar rectangular frame, the distance sum from the point cloud to the planar rectangular frame, and the point-count extreme difference; calculating the objective function value L according to the shape presented by the point cloud distribution in the clustered point cloud data; selecting, within the set traversal angle range, the heading angle corresponding to the maximum L as the optimal heading angle; and finally determining the position, size and attitude of the optimal three-dimensional bounding box of the target from the position, in the lidar coordinate system, of the center of the planar rectangular frame corresponding to the optimal heading angle.
In one embodiment, the objective function value L is calculated by equation (17) in the case where the point cloud distribution shape is L-shaped, and the objective function value L' is calculated by equation (25) in the case where the point cloud distribution shape is non-L-shaped.
In one embodiment, nd is calculated by equation (7), S by equation (6), D by equation (16), and Dn by equation (24).
While the vehicle is running, the lidar mounted on the vehicle collects point cloud data of the surrounding road environment. The data analysis module acquires the point cloud information collected by the lidar and inputs it to the point cloud clustering module; target shape and heading estimation is then performed on the clustered target point clouds, and the bounding box position, size and heading of all targets are calculated and output, so that the system realizes the steps of the target shape and heading estimation method.
Finally, it should be pointed out that: the above examples are only for illustrating the technical solutions of the present invention, and are not limited thereto. Those of ordinary skill in the art will understand that: modifications can be made to the technical solutions described in the foregoing embodiments, or some technical features may be equivalently replaced; such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (10)

1. A method for estimating a shape and a heading of an object, comprising:
step S1, acquiring original target point cloud data, and clustering to obtain clustered point cloud data;
step S2, describing the clustered point cloud data through a three-dimensional bounding box;
step S3, rotating the x and y coordinates of each point cloud in the clustered point cloud data along a set direction by a preset step length;
step S4, establishing a two-dimensional rectangular coordinate system xO1y, with the center O1 of the planar rectangular frame of the xoy plane corresponding to the three-dimensional bounding box as origin, to form four quadrants; determining, according to the quadrant in which the point cloud feature point O2(xc, yc) described by equation (2) is located, the area of the planar rectangular frame, the distance sum from the point cloud to the planar rectangular frame, and the point-count extreme difference; and calculating the objective function value according to the shape presented by the point cloud distribution in the clustered point cloud data;
xc = (1/n) Σ xri,  yc = (1/n) Σ yri  (each sum over i = 1, …, n)    (2)
in the formula, xri is the x coordinate and yri the y coordinate of the rotated ith point cloud obtained in step S3, and n is the total number of points in the clustered point cloud data;
step S5, judging whether the current course angle is in the set traversal angle range, if yes, returning to step S3; otherwise, go to step S6; the course angle is an included angle between the orientation of the target and an x axis of a laser radar coordinate system;
step S6, selecting the course angle corresponding to the maximum objective function value as the optimal course angle;
and step S7, determining the position, size and posture of the optimal three-dimensional boundary frame of the target according to the position of the center of the plane rectangular frame corresponding to the optimal course angle under the laser radar coordinate system.
2. The method for estimating shape and heading as claimed in claim 1, wherein in step S4, the objective function value L is calculated by equation (17) when the point cloud distribution shape is L-shaped, and the objective function value L' is calculated by equation (25) when the point cloud distribution shape is non-L-shaped:
L=(nd/S)2/D (17)
L′=1/(S*Dn) (25)
in the formula, nd is the point-count extreme difference, S is the area of the planar rectangular frame, and D and Dn are the distance sums from the point cloud to the planar rectangular frame for L-shaped and non-L-shaped point cloud distributions, respectively.
3. The method for estimating the shape and heading of an object as claimed in claim 2, wherein, in step S4, nd is calculated by equation (7):
nd=nmax-nmin (7)
in the formula, nmax is the number of point clouds in the quadrant in which the point cloud feature point is located, and nmin is the number of point clouds in the quadrant diagonal to the quadrant in which the point cloud feature point is located.
4. The method for estimating the shape and heading of the object as claimed in claim 2, wherein in step S4, S is calculated by equation (6):
S=a*b (6)
a=xmax-xmin (4)
b=ymax-ymin (5)
in the formula, xmax is the maximum of xri, xmin the minimum of xri, ymax the maximum of yri, and ymin the minimum of yri.
5. The method for estimating the shape and heading of an object as claimed in claim 3 or 4, wherein, in step S4, Dn is calculated by equation (24):
Dn = Σ d'1i + Σ d'2i + Σ d'3i + Σ d'4i  (each sum over the points in the corresponding quadrant)    (24)
d′1i=min{|xri-xmax|,|yri-ymax|} (20)
d′2i=min{|xri-xmax|,|yri-ymin|} (21)
d′3i=min{|xri-xmin|,|yri-ymin|} (22)
d′4i=min{|xri-xmin|,|yri-ymax|} (23)
in the formulae, d'1i is the closest distance, given by equation (20), from the ith point cloud to the planar rectangular frame in the first quadrant; d'2i the closest distance, given by equation (21), in the second quadrant; d'3i the closest distance, given by equation (22), in the third quadrant; and d'4i the closest distance, given by equation (23), in the fourth quadrant.
6. The method for estimating the shape and heading of the object according to any one of claims 2 to 4, wherein in step S4, D is calculated by equation (16):
D = Σ d1i + Σ d2i + Σ d3i + Σ d4i  (each sum over the points in the corresponding quadrant)    (16)
in the formula, d1i is the closest distance from the ith point cloud to the planar rectangular frame in the first quadrant, d2i the closest distance in the second quadrant, d3i the closest distance in the third quadrant, and d4i the closest distance in the fourth quadrant.
7. The method of claim 6, wherein, in the case where O2 is located in the first quadrant, d1i, d2i, d3i and d4i are obtained by the following equations (8) and (9):
d1i = min{|xri - xmax|, |yri - ymax|}
d2i = min{|xri - xmax|, |yri - ymin|}    (8)
d4i = min{|xri - xmin|, |yri - ymax|}
d3i=k*max{|xri-xmin|,|yri-ymin|} (9)
in the case where O2 is located in the second quadrant, d1i, d2i, d3i and d4i are obtained by the following equations (10) and (11):
d1i = min{|xri - xmax|, |yri - ymax|}
d2i = min{|xri - xmax|, |yri - ymin|}    (10)
d3i = min{|xri - xmin|, |yri - ymin|}
d4i=k*max{|xri-xmin|,|yri-ymax|} (11)
in the case where O2 is located in the third quadrant, d1i, d2i, d3i and d4i are obtained by the following equations (12) and (13):
d2i = min{|xri - xmax|, |yri - ymin|}
d3i = min{|xri - xmin|, |yri - ymin|}    (12)
d4i = min{|xri - xmin|, |yri - ymax|}
d1i=k*max{|xri-xmax|,|yri-ymax|} (13)
in the case where O2 is located in the fourth quadrant, d1i, d2i, d3i and d4i are obtained by the following equations (14) and (15):
d1i = min{|xri - xmax|, |yri - ymax|}
d3i = min{|xri - xmin|, |yri - ymin|}    (14)
d4i = min{|xri - xmin|, |yri - ymax|}
d2i=k*max{|xri-xmax|,|yri-ymin|} (15)。
8. a target shape and heading estimation system, comprising:
the data analysis module is used for acquiring original target point cloud data;
the point cloud clustering module is used for clustering the target point cloud data to obtain clustered point cloud data;
a target shape and heading estimation module for: describing the clustered point cloud data by a three-dimensional bounding box; rotating the x and y coordinates of each point in the clustered point cloud data along a set direction by a preset step; establishing a two-dimensional rectangular coordinate system xO1y, with the center O1 of the planar rectangular frame of the xoy plane corresponding to the three-dimensional bounding box as origin, to form four quadrants; determining, according to the quadrant in which the point cloud feature point O2(xc, yc) described by equation (2) is located, the area of the planar rectangular frame, the distance sum from the point cloud to the planar rectangular frame, and the point-count extreme difference; calculating the objective function value according to the shape presented by the point cloud distribution in the clustered point cloud data; selecting, within the set traversal angle range, the heading angle corresponding to the maximum objective function value as the optimal heading angle; and finally determining the position, size and attitude of the optimal three-dimensional bounding box of the target from the position, in the lidar coordinate system, of the center of the planar rectangular frame corresponding to the optimal heading angle;
xc = (1/n) Σ xri,  yc = (1/n) Σ yri  (each sum over i = 1, …, n)    (2)
in the formula, xri is the x coordinate and yri the y coordinate of the rotated ith point cloud, n is the total number of points in the clustered point cloud data, and the heading angle is the angle between the orientation of the target and the x axis of the lidar coordinate system.
9. The objective shape and heading estimation system of claim 8, wherein the objective function value L is calculated by equation (17) in a case where the point cloud distribution shape is L-shaped, and the objective function value L' is calculated by equation (25) in a case where the point cloud distribution shape is non-L-shaped:
L=(nd/S)2/D (17)
L′=1/(S*Dn) (25)
in the formula, nd is the point-count extreme difference, S is the area of the planar rectangular frame, and D and Dn are the distance sums from the point cloud to the planar rectangular frame for L-shaped and non-L-shaped point cloud distributions, respectively.
10. The object shape and heading estimation system of claim 9, wherein nd is calculated by equation (7), S by equation (6), D by equation (16), and Dn by equation (24);
nd=nmax-nmin (7)
S=a*b (6)
a=xmax-xmin (4)
b=ymax-ymin (5)
D = Σ d1i + Σ d2i + Σ d3i + Σ d4i  (each sum over the points in the corresponding quadrant)    (16)
Dn = Σ d'1i + Σ d'2i + Σ d'3i + Σ d'4i  (each sum over the points in the corresponding quadrant)    (24)
in the formula, nmax is the number of point clouds in the quadrant in which the point cloud feature point is located, nmin is the number of point clouds in the quadrant diagonal to the quadrant in which the point cloud feature point is located, xmax is the maximum of xri, xmin the minimum of xri, ymax the maximum of yri, and ymin the minimum of yri; d'1i is the closest distance, given by equation (20), from the ith point cloud to the planar rectangular frame in the first quadrant, d'2i the closest distance, given by equation (21), in the second quadrant, d'3i the closest distance, given by equation (22), in the third quadrant, and d'4i the closest distance, given by equation (23), in the fourth quadrant; d1i is the closest distance from the ith point cloud to the planar rectangular frame in the first quadrant, d2i the closest distance in the second quadrant, d3i the closest distance in the third quadrant, and d4i the closest distance in the fourth quadrant.
CN202111097076.7A 2021-09-18 2021-09-18 Target shape and course estimation method and system Active CN113807442B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111097076.7A CN113807442B (en) 2021-09-18 2021-09-18 Target shape and course estimation method and system

Publications (2)

Publication Number Publication Date
CN113807442A true CN113807442A (en) 2021-12-17
CN113807442B CN113807442B (en) 2022-04-19

Family

ID=78895927

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111097076.7A Active CN113807442B (en) 2021-09-18 2021-09-18 Target shape and course estimation method and system

Country Status (1)

Country Link
CN (1) CN113807442B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105223583A (en) * 2015-09-10 2016-01-06 清华大学 A kind of target vehicle course angle computing method based on three-dimensional laser radar
CN107782240A (en) * 2017-09-27 2018-03-09 首都师范大学 A kind of two dimensional laser scanning instrument scaling method, system and device
CN108549084A (en) * 2018-01-30 2018-09-18 西安交通大学 A kind of target detection based on sparse two-dimensional laser radar and Attitude estimation method
CN110906924A (en) * 2019-12-17 2020-03-24 杭州光珀智能科技有限公司 Positioning initialization method and device, positioning method and device and mobile device
CN112614186A (en) * 2020-12-28 2021-04-06 上海汽车工业(集团)总公司 Target pose calculation method and calculation module
WO2021074660A1 (en) * 2019-10-18 2021-04-22 日産自動車株式会社 Object recognition method and object recognition device
CN113239726A (en) * 2021-04-06 2021-08-10 北京航空航天大学杭州创新研究院 Target detection method and device based on coloring point cloud and electronic equipment

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
JIXIAN ZHANG ET AL.: "Automatic Vehicle Extraction from Airborne LiDAR Data Using an Object-Based Point Cloud Analysis Method", 《REMOTE SENSING 》 *
WANG Xiao et al.: "Target parameter identification of intelligent vehicles based on three-dimensional lidar", Automotive Engineering *

Also Published As

Publication number Publication date
CN113807442B (en) 2022-04-19

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant