CN113761990A - Road boundary detection method and device, unmanned vehicle and storage medium - Google Patents


Info

Publication number: CN113761990A
Application number: CN202010538071.2A
Authority: CN (China)
Prior art keywords: boundary; position information; target; road boundary; road
Priority date / filing date: 2020-06-12
Legal status: Granted; Active
Other languages: Chinese (zh)
Other versions: CN113761990B
Inventor: 陈建兴
Current Assignee: Beijing Jingdong Qianshi Technology Co Ltd
Original Assignee: Beijing Jingdong Qianshi Technology Co Ltd
Application filed by Beijing Jingdong Qianshi Technology Co Ltd

Classifications

    • G06T7/13 Image analysis; Segmentation; Edge detection
    • G06T7/60 Image analysis; Analysis of geometric attributes
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T2207/10004 Image acquisition modality; Still image; Photographic image
    • G06T2207/30248 Subject of image; Vehicle exterior or interior
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle
    • G06T2207/30256 Lane; Road marking

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Navigation (AREA)

Abstract

The embodiment of the invention discloses a road boundary detection method and device, an unmanned vehicle and a storage medium, wherein the method comprises: acquiring current position information, and determining reference position information for road boundary detection based on the current position information; determining target boundary information associated with the reference position information from a pre-constructed axis-aligned bounding rectangle tree; and determining the road boundary width according to the target boundary information and the reference position information. The road boundary detection method provided by the embodiment of the invention detects the boundary of the road currently being driven on by means of the pre-constructed axis-aligned bounding rectangle tree and the current position information, thereby ensuring the accuracy of the road boundary detection result while improving the real-time performance of road boundary detection.

Description

Road boundary detection method and device, unmanned vehicle and storage medium
Technical Field
The embodiment of the invention relates to the technical field of boundary detection, in particular to a road boundary detection method and device, an unmanned vehicle and a storage medium.
Background
During automatic driving, an unmanned vehicle generally needs to acquire the boundaries of the surrounding road from the map, so that a route planning range can be provided for the vehicle based on those boundaries. At present, there are two methods for calculating the road boundary width: the first directly reads the road width from the map; the second uses the shortest distance from a point to a line segment as the boundary width.
In the process of implementing the invention, the inventor found that the prior art has at least the following technical problems: directly reading the road width from the map is inaccurate, because a single width is usually stored for an entire road, which clearly does not meet the accuracy required for route planning; using the shortest point-to-segment distance as the boundary width alleviates the accuracy problem to some extent, but because of the nature of the search it requires, it has poor real-time performance and low computational efficiency, and in areas where the boundary shape changes sharply the computed result may fail to meet the passing requirements of the unmanned vehicle.
Disclosure of Invention
The embodiment of the invention provides a road boundary detection method and device, an unmanned vehicle and a storage medium, which solve the technical problems of inaccurate road boundary detection and poor real-time performance in the prior art and ensure the accuracy of the road boundary detection result while improving the real-time performance of road boundary detection.
In a first aspect, an embodiment of the present invention provides a road boundary detection method, including:
acquiring current position information, and determining reference position information of road boundary detection based on the current position information;
determining target boundary information associated with the reference position information from a pre-constructed axis-aligned bounding rectangular tree;
and determining the width of the road boundary according to the target boundary information and the reference position information.
In a second aspect, an embodiment of the present invention further provides a road boundary detection apparatus, including:
the reference position determining module is used for acquiring current position information and determining reference position information of road boundary detection based on the current position information;
the target boundary determining module is used for determining target boundary information associated with the reference position information from a pre-constructed axis-aligned bounding rectangular tree;
and the boundary width determining module is used for determining the road boundary width according to the target boundary information and the reference position information.
In a third aspect, an embodiment of the present invention further provides an unmanned vehicle, where the unmanned vehicle includes:
one or more processors;
storage means for storing one or more programs;
When the one or more programs are executed by the one or more processors, the one or more processors implement the road boundary detection method as provided by any of the embodiments of the present invention.
In a fourth aspect, the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the road boundary detection method provided in any embodiment of the present invention.
In the embodiment of the invention, current position information is acquired and reference position information for road boundary detection is determined based on it; target boundary information associated with the reference position information is determined from a pre-constructed axis-aligned bounding rectangle tree, which simplifies the search process for determining the target boundary information; and the road boundary width is determined according to the target boundary information and the reference position information, which reduces the amount of computation and improves the accuracy of the road boundary width, thereby ensuring the accuracy of the road boundary detection result while improving the real-time performance of road boundary detection.
Drawings
Fig. 1 is a flowchart of a road boundary detection method according to an embodiment of the present invention;
fig. 2 is a flowchart of a road boundary detection method according to a second embodiment of the present invention;
fig. 3a is a flowchart of a road boundary detection method according to a third embodiment of the present invention;
FIG. 3b is a schematic diagram of a discretized road boundary provided by a third embodiment of the present invention;
fig. 3c is a schematic view of a road side boundary segment coding according to the third embodiment of the present invention;
FIG. 3d is a schematic diagram of an axis-aligned bounding rectangle according to a third embodiment of the present invention;
FIG. 3e is a schematic diagram of an axis-aligned bounding rectangle tree according to a third embodiment of the present invention;
FIG. 3f is a schematic diagram of determining a target boundary line segment according to a third embodiment of the present invention;
fig. 4 is a schematic structural diagram of a road boundary detection apparatus according to a fourth embodiment of the present invention;
fig. 5 is a schematic structural diagram of an unmanned vehicle according to a fifth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Example one
Fig. 1 is a flowchart of a road boundary detection method according to an embodiment of the present invention. This embodiment is applicable to road boundary detection, in particular to detecting the boundary of the road on which an unmanned vehicle is currently driving. The method may be performed by a road boundary detection device, which may be implemented in software and/or hardware and may, for example, be configured in an unmanned vehicle. As shown in fig. 1, the method comprises:
and S110, acquiring current position information, and determining reference position information of road boundary detection based on the current position information.
In this embodiment, the current position information may be the current position information of the unmanned vehicle. Optionally, it may be determined by a Global Positioning System (GPS) sensor in the unmanned vehicle. After the current position information is acquired, reference position information for road boundary detection is determined based on it; the reference position information indicates the reference position for calculating the road boundary width and is used to determine the matching range of the road boundary.
In the present embodiment, the form of the reference position information is not limited herein. Optionally, the reference position information may be a center point position of the unmanned vehicle, or may be a center line segment position of the unmanned vehicle. Preferably, the reference position information is a vehicle position line segment of the target vehicle. The target vehicle is an unmanned vehicle, that is, the reference position information is a center line segment position of the unmanned vehicle. The road boundary width is calculated by taking the position of the central line segment of the unmanned vehicle as reference position information, so that the distance between each part of the unmanned vehicle and the road boundary is considered in the calculation of the road boundary width, and the calculation of the road boundary width is more reasonable and more accurate.
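For illustration only, such a center line segment can be derived from a pose estimate. The following sketch assumes the current position information provides (x, y) map coordinates, a heading angle in radians, and a known vehicle length; these field names and the helper function are assumptions of this sketch, not part of the patent.

```python
import math

def vehicle_center_segment(x, y, heading, vehicle_length):
    """Return the vehicle's center line segment as ((x1, y1), (x2, y2)).

    The segment runs along the vehicle's longitudinal axis and is centered
    on the given reference point (e.g. a GPS fix mapped into map coordinates).
    """
    half = 0.5 * vehicle_length
    dx, dy = math.cos(heading), math.sin(heading)
    return (x - half * dx, y - half * dy), (x + half * dx, y + half * dy)

# Example: a 4.5 m vehicle at (10.0, 5.0) heading due east (0 rad)
l_ego = vehicle_center_segment(10.0, 5.0, 0.0, 4.5)
```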
And S120, determining target boundary information associated with the reference position information from the pre-constructed axis-aligned bounding rectangle tree.
This addresses the technical problem in the prior art that the distance between every road boundary line segment and the reference position information has to be calculated, which leads to poor real-time performance. In this embodiment, the road boundary in the map is constructed in advance as an axis-aligned bounding rectangle tree (AABB tree); based on this pre-constructed tree, a subset of the road boundary line segments is selected from all road boundary line segments as the target boundary information, and the road boundary width is calculated based on the target boundary information. Compared with the prior-art approach of calculating the road boundary width for every road boundary line segment, this reduces the amount of computation and improves the real-time performance of road boundary detection.
Alternatively, the target boundary information associated with the reference position information may be the road boundary information within a reference matching range of a set size corresponding to the reference position information. In one embodiment of the present invention, determining the target boundary information associated with the reference position information from the pre-constructed axis-aligned bounding rectangle tree includes: determining a reference matching range according to the reference position information; acquiring a rectangular coverage area corresponding to each axis-aligned bounding rectangle in the axis-aligned bounding rectangle tree, and taking the axis-aligned bounding rectangles whose rectangular coverage areas intersect the reference matching range as target axis-aligned bounding rectangles; and taking the road boundary line segments corresponding to the target axis-aligned bounding rectangles as the target boundary information.
The AABB tree provides a query function for determining the line segments associated with a specified area. In this embodiment, this query function is used, with the reference matching range as the specified area, to find the road boundary line segments in the AABB tree that are associated with the reference matching range and use them as the target boundary information. Specifically, the rectangular coverage area corresponding to each axis-aligned bounding rectangle in the AABB tree is obtained, and the road boundary line segments corresponding to the axis-aligned bounding rectangles that intersect the reference matching range are used as the target boundary information. The current position information, the reference matching range, the rectangular coverage area, and the like may all be determined in the map coordinate system.
Alternatively, the reference matching range may be determined according to the observable range of the unmanned vehicle. For example, the set radius of the reference matching range may be determined according to the observable range of the unmanned vehicle. After the reference position information is determined, the center of the reference matching range is determined based on the reference position, and a circular area centered on that point with the set radius is used as the reference matching range. If the reference position information is the center point position of the unmanned vehicle, it can be used directly as the center of the reference matching range; if the reference position information is the center line segment position of the unmanned vehicle, the midpoint of that line segment can be used as the center of the reference matching range.
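A minimal sketch of the range test implied here, assuming each axis-aligned bounding rectangle is stored as (min_x, min_y, max_x, max_y) and the reference matching range is a circle given by its center and set radius; the function names are illustrative only, and a real AABB tree would apply the same test to whole subtrees instead of scanning a flat list.

```python
def circle_intersects_aabb(cx, cy, radius, min_x, min_y, max_x, max_y):
    """True if the circular reference matching range overlaps the rectangle."""
    # Clamp the circle center onto the rectangle, then compare the distance
    # from the center to that closest point against the radius.
    nearest_x = min(max(cx, min_x), max_x)
    nearest_y = min(max(cy, min_y), max_y)
    dx, dy = cx - nearest_x, cy - nearest_y
    return dx * dx + dy * dy <= radius * radius

def query_target_boundary(segment_boxes, cx, cy, radius):
    """Return road boundary segments whose AABB intersects the matching range.

    segment_boxes holds (segment, (min_x, min_y, max_x, max_y)) pairs.
    """
    return [seg for seg, box in segment_boxes
            if circle_intersects_aabb(cx, cy, radius, *box)]
```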
And S130, determining the width of the road boundary according to the target boundary information and the reference position information.
In the present embodiment, after the target boundary information is determined, the road boundary width may be calculated from the target boundary information and the reference position information. The calculation mode of the road boundary width can be set according to actual requirements. Optionally, the total road boundary width, that is, the width between the left road boundary and the right road boundary, may be calculated according to the target boundary information; or the left road boundary width from the unmanned vehicle to the left road boundary and the right road boundary width from the unmanned vehicle to the right road boundary may be calculated from the reference position information and the target boundary information, respectively.
In both of the above two road boundary width calculation methods, the left road boundary and the right road boundary need to be determined. Alternatively, the position relationship between the road boundary line segment and the reference position information may be determined according to the position information (e.g., line segment coordinates) of each road boundary line segment in the target boundary information and the reference position information (e.g., reference coordinates), the road boundary line segment located on the left side of the reference position information may be used as the left side road boundary, and the road boundary line segment located on the right side of the reference position information may be used as the right side road boundary. After the left and right road boundaries are determined, the total width of the road boundaries or the left and right road boundary widths may be calculated based on the left and right road boundaries.
Generally, in order to improve the accuracy of road planning, it is necessary to determine the width of an unmanned vehicle from a left road boundary to a left road boundary and the width of a right road boundary to a right road boundary. In one embodiment of the present invention, determining a road boundary width from target boundary information and reference position information includes: determining left boundary information and right boundary information in the target boundary information according to the boundary position information and the reference position information of the target boundary information; and determining the width of the left road boundary according to the left boundary information and the reference position information, and determining the width of the right road boundary according to the right boundary information and the reference position information. Optionally, after the left road boundary and the right road boundary are determined, the left road boundary is used as left boundary information, the right road boundary is used as right boundary information, the width of the left road boundary is calculated according to the left boundary information and the reference position information, and the width of the right road boundary is calculated according to the right boundary information and the reference position information.
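The patent does not spell out how the left/right split is computed; one common realization, sketched here under the assumption that the reference position is a directed line segment and each boundary segment is a pair of (x, y) points, is a 2-D cross-product test against the vehicle direction.

```python
def side_of_reference(ref_start, ref_end, segment):
    """Classify a boundary segment as 'left' or 'right' of the reference segment.

    The sign of the cross product between the reference direction and the
    vector to the boundary segment's midpoint decides the side (left for a
    positive sign in a standard x-right / y-up map frame).
    """
    vx, vy = ref_end[0] - ref_start[0], ref_end[1] - ref_start[1]
    (ax, ay), (bx, by) = segment
    mx, my = (ax + bx) / 2.0 - ref_start[0], (ay + by) / 2.0 - ref_start[1]
    return 'left' if vx * my - vy * mx > 0 else 'right'

def split_boundaries(ref_start, ref_end, segments):
    lines_left = [s for s in segments if side_of_reference(ref_start, ref_end, s) == 'left']
    lines_right = [s for s in segments if side_of_reference(ref_start, ref_end, s) == 'right']
    return lines_left, lines_right
```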
Optionally, determining the left road boundary width according to the left boundary information and the reference position information, and determining the right road boundary width according to the right boundary information and the reference position information, includes: calculating the left road distance between each left boundary information and the reference position information, and taking the shortest left road distance as the width of the left road boundary; and calculating the right road distance between each piece of right boundary information and the reference position information, and taking the shortest right road distance as the width of the right road boundary. Specifically, a left-side road distance between each left-side road boundary in the left boundary information and the reference position information is calculated, a shortest left-side road distance is used as a left-side road boundary width, a right-side road distance between each right-side road boundary in the right boundary information and the reference position information is calculated, and a shortest right-side road distance is used as a right-side road boundary width. If the reference position information is the position of the central point of the unmanned vehicle, the distance between the left road boundary or the right road boundary and the reference position information can be calculated according to a point-to-line segment calculation mode; if the reference position information is the position of the center line segment of the unmanned vehicle, the distance between the left road boundary or the right road boundary and the reference position information can be calculated according to the calculation mode from the line segment to the line segment. Preferably, the road boundary width is calculated by using the center line segment position of the unmanned vehicle as reference position information, so that the calculation of the road boundary width is more accurate.
In the embodiment of the invention, the current position information is acquired and the reference position information is determined based on it; the target boundary information associated with the reference position information is determined from the pre-constructed axis-aligned bounding rectangle tree, which simplifies the search process for determining the target boundary information; and the road boundary width is determined according to the target boundary information and the reference position information, which reduces the amount of computation and improves the accuracy of the road boundary width, thereby ensuring the accuracy of the road boundary detection result while improving the real-time performance of road boundary detection.
Example two
Fig. 2 is a flowchart of a road boundary detection method according to a second embodiment of the present invention. The present embodiment is further optimized based on the above embodiments. As shown in fig. 2, the method includes:
s210, acquiring target boundary points of the road boundary, and generating a road boundary line segment based on the adjacent target boundary points.
In this embodiment, an axis-aligned bounding rectangular tree corresponding to a road boundary line segment is constructed according to a road boundary in a map, so as to reduce the calculation amount during road boundary width calculation and improve the real-time performance of road boundary detection. Optionally, the operation of constructing the axis-aligned bounding rectangular tree may be performed before performing real-time detection on the road boundary, or a new axis-aligned bounding rectangular tree may be constructed according to the updated map information after updating the map information, which is not limited herein. Optionally, the target boundary points of the road boundary may be obtained from the map, adjacent discrete target boundary points in all the road boundaries are connected to form a road boundary line segment, and the target boundary points are encoded to obtain the encoding of the road boundary line segment.
In one embodiment, obtaining the target boundary points of the road boundary comprises: acquiring original boundary points of the road boundary, and performing data preprocessing on the original boundary points to obtain the target boundary points. Optionally, when the map is a high-precision map, the density of the original boundary points in the map is high, and generating road boundary line segments directly from the original boundary points and constructing the axis-aligned bounding rectangle tree from them would involve a large amount of computation. In order to reduce this, the original boundary points in the map may be screened to obtain the target boundary points, and the road boundary line segments are constructed based on the screened target boundary points. Optionally, the data preprocessing performed on the original boundary points may be a de-duplication operation, so as to obtain a simplified set of target boundary points that still accurately represents the course of the road.
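A minimal sketch of the preprocessing and segment generation described above; the spacing threshold used for de-duplication is an assumed parameter, not a value taken from the patent.

```python
import math

def preprocess_boundary_points(original_points, min_spacing=0.5):
    """Drop duplicate and near-duplicate boundary points (simple thinning)."""
    target_points = []
    for p in original_points:
        if not target_points or math.dist(target_points[-1], p) >= min_spacing:
            target_points.append(p)
    return target_points

def build_boundary_segments(target_points):
    """Connect adjacent target boundary points into road boundary line segments."""
    return [(target_points[i], target_points[i + 1])
            for i in range(len(target_points) - 1)]
```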
S220, constructing an axis-aligned bounding rectangle corresponding to each road boundary line segment, and constructing an axis-aligned bounding rectangle tree based on the axis-aligned bounding rectangles.
In this embodiment, after the road boundary line segments are generated, an axis-aligned bounding rectangle (AABB) corresponding to each road boundary line segment is constructed based on the vertices of the encoded line segment, and the AABBs of the line segments are then built into a binary tree to obtain the AABB tree. Constructing the AABB tree keeps the segment-based range query time within a scale of log(n), where n is the number of road boundary line segments, which reduces the query time for the target boundary information. Optionally, for each road boundary line segment, the rectangle having that line segment as its diagonal may be taken as the axis-aligned bounding rectangle corresponding to the line segment. It should be noted that, after the axis-aligned bounding rectangle tree is constructed, the rectangular coverage area corresponding to each axis-aligned bounding rectangle needs to be determined, so as to facilitate the subsequent query of the target boundary information.
In an embodiment of the present invention, constructing an axis-aligned bounding rectangle corresponding to each road boundary line segment, and constructing an axis-aligned bounding rectangle tree based on the axis-aligned bounding rectangles includes: aiming at each axis alignment bounding rectangle, creating a target leaf node corresponding to the axis alignment bounding rectangle, and determining the target level of the target leaf node; creating a target branch node based on the target level, adding a target leaf node to the target branch node, adding an associated node associated with the target branch node to the target branch node, and deleting the associated node; and connecting the target branch node with the existing node until all the axis-aligned bounding rectangles are traversed to obtain an axis-aligned bounding rectangle tree. In this embodiment, an axis-aligned bounding rectangle tree may be constructed based on an axis-aligned bounding rectangle corresponding to each road boundary line segment by using an existing axis-aligned bounding rectangle tree construction method. It can be understood that each axis-aligned bounding rectangle may be traversed to construct a node corresponding to the axis-aligned bounding rectangle, and the nodes corresponding to the axis-aligned bounding rectangles are associated according to the association relationship between the axis-aligned bounding rectangles, so as to obtain a constructed axis-aligned bounding rectangle tree.
Specifically, a leaf node corresponding to the first axis-aligned bounding rectangle may be created and used as the root of the axis-aligned bounding rectangle tree, and nodes for subsequent axis-aligned bounding rectangles are then added relative to the current root. Suppose the initial axis-aligned bounding rectangle is object (1) and the next one added is object (2): a new branch node is created and assigned an axis-aligned bounding rectangle large enough to contain both object (1) and object (2); a new leaf node is created for object (2) and attached to the new branch node; the original leaf node of object (1) is also attached to the new branch node; and finally the new branch node becomes the root of the tree. When a further object (3) is added, a new branch node (B) is created and assigned an axis-aligned bounding rectangle containing object (1) and object (3); a new leaf node is created for object (3) and attached to branch node (B); leaf node (1) is moved to become a child of branch node (B), and branch node (B) is attached under branch node (A); finally, the axis-aligned bounding rectangle assigned to branch node (A) is enlarged to take the new leaf into account, since without this adjustment it would no longer be large enough to contain the axis-aligned bounding rectangles of all its descendants.
Essentially, each time an object is added to the axis-aligned bounding rectangle tree, the corresponding operations are performed so that the rules for branch and root nodes described above still hold. To summarize, when an object is added to the tree: first, a leaf node is created for the object and an axis-aligned bounding rectangle is allocated for it; second, the best existing node (leaf or branch) in the tree is found to become the sibling of the new leaf; then a new branch node is created for the located node and the new leaf, and it is assigned an axis-aligned bounding rectangle containing both of them (essentially the union of the rectangles of the located node and the new leaf); then the new leaf is attached to the new branch node, the located node is detached from the tree and attached to the new branch node, and the new branch node is connected as a child of the located node's previous parent; finally, walking back up the tree, all axis-aligned bounding rectangles are adjusted to ensure that they still contain the rectangles of all their descendants. In this process, finding the best existing node in the tree is particularly important: the procedure descends the tree and evaluates the cost of attaching the new leaf to the left or right child of each branch. The better this decision, the more balanced the tree and the lower the cost of subsequent queries. A common approach is to use as the cost the surface area of the left or right node's bounding rectangle after it has been enlarged to include the new leaf, and to descend toward the node with the lowest cost until a leaf node is reached.
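The insertion procedure can be sketched as follows. This is a simplified illustration only: it uses the rectangle perimeter as the growth cost when descending the tree (the text mentions surface area; any growth measure serves the sketch) and omits the balancing refinements of production AABB-tree implementations.

```python
class _Node:
    """Node of a simplified axis-aligned bounding rectangle (AABB) tree."""
    def __init__(self, box, segment=None):
        self.box = box            # (min_x, min_y, max_x, max_y)
        self.segment = segment    # payload; only set on leaf nodes
        self.left = None
        self.right = None
        self.parent = None

    @property
    def is_leaf(self):
        return self.left is None


def _union(a, b):
    return (min(a[0], b[0]), min(a[1], b[1]), max(a[2], b[2]), max(a[3], b[3]))


def _perimeter(box):
    return 2.0 * ((box[2] - box[0]) + (box[3] - box[1]))


class AABBTree:
    def __init__(self):
        self.root = None

    def insert(self, box, segment):
        leaf = _Node(box, segment)
        if self.root is None:
            self.root = leaf
            return
        # 1. Descend toward the child whose rectangle grows least when it is
        #    enlarged to contain the new leaf (the cost heuristic).
        node = self.root
        while not node.is_leaf:
            cost_left = _perimeter(_union(node.left.box, box))
            cost_right = _perimeter(_union(node.right.box, box))
            node = node.left if cost_left < cost_right else node.right
        # 2. Create a new branch node holding the located node and the new leaf.
        old_parent = node.parent
        branch = _Node(_union(node.box, box))
        branch.left, branch.right = node, leaf
        node.parent = leaf.parent = branch
        branch.parent = old_parent
        if old_parent is None:
            self.root = branch
        elif old_parent.left is node:
            old_parent.left = branch
        else:
            old_parent.right = branch
        # 3. Walk back up the tree and enlarge ancestor rectangles as needed.
        walker = branch.parent
        while walker is not None:
            walker.box = _union(walker.left.box, walker.right.box)
            walker = walker.parent

    def query(self, overlaps):
        """Yield the segments of all leaves whose rectangle satisfies overlaps(box)."""
        stack = [self.root] if self.root else []
        while stack:
            node = stack.pop()
            if not overlaps(node.box):
                continue
            if node.is_leaf:
                yield node.segment
            else:
                stack.extend((node.left, node.right))
```

A query then walks the tree and skips every subtree whose rectangle fails the overlap test, which is what keeps the lookup close to log(n) for reasonably balanced trees.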
And S230, acquiring current position information, and determining reference position information based on the current position information.
And S240, determining target boundary information associated with the reference position information from the pre-constructed axis-aligned bounding rectangle tree.
And S250, determining the width of the road boundary according to the target boundary information and the reference position information.
On the basis of the above embodiments, this embodiment adds the operation of constructing the axis-aligned bounding rectangle tree: target boundary points of the road boundary are acquired, road boundary line segments are generated based on adjacent target boundary points, an axis-aligned bounding rectangle is constructed for each road boundary line segment, and the axis-aligned bounding rectangle tree is constructed based on these rectangles. This keeps the segment-based query time within a bounded scale, reduces the time needed to determine the target boundary information during road boundary detection, and improves the real-time performance of road boundary detection.
EXAMPLE III
Fig. 3a is a flowchart of a road boundary detection method according to a third embodiment of the present invention. The present embodiment provides a preferred embodiment based on the above-described embodiments. As shown in fig. 3a, the method comprises:
s310, a road boundary initialization stage.
In the present embodiment, the high-precision map boundary data is preprocessed through a road boundary initialization stage. Optionally, the road boundary initialization stage includes three steps of road boundary encoding, constructing an axis-aligned bounding rectangle, and constructing a binary tree from the axis-aligned bounding rectangle. Specifically, the method comprises the following steps:
(1) The road boundaries are encoded. After the discretized road boundary is obtained from the map, adjacent discretized boundary points in all the boundaries are encoded to form a set of line segments. Fig. 3b is a schematic diagram of a discretized road boundary according to a third embodiment of the present invention. Fig. 3c is a schematic diagram of road side boundary segment coding according to the third embodiment of the present invention. As shown in fig. 3b, the discretized road boundary is composed of discrete points. As shown in fig. 3c, adjacent discrete points are connected, and the encoded boundary line segments shown in fig. 3c are obtained based on the encoding of the discrete boundary points.
(2) An axis-aligned bounding rectangle is constructed. The AABB of each encoded line segment is constructed from the two points (max_x, max_y) and (min_x, min_y) of that segment. Fig. 3d is a schematic diagram of an axis-aligned bounding rectangle according to a third embodiment of the present invention. As shown in fig. 3d, the rectangle having the road boundary line segment as its diagonal is taken as the axis-aligned bounding rectangle corresponding to that road boundary line segment.
(3) The AABBs are constructed into a binary tree. In order to keep the data processing scale under control, the AABBs of the line segments can be built into a binary tree, so that the segment-based range query time is kept within a scale of log(n). Fig. 3e is a schematic diagram of an axis-aligned bounding rectangle tree according to a third embodiment of the present invention. As shown in fig. 3e, each node in the figure represents a road boundary line segment.
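Step (2) above amounts to taking the per-axis minima and maxima of each encoded segment; a one-function sketch follows, assuming a segment is a pair of (x, y) points. Step (3) then inserts these rectangles into a binary tree, as sketched in the second embodiment.

```python
def segment_aabb(segment):
    """Axis-aligned bounding rectangle (min_x, min_y, max_x, max_y) of a segment."""
    (x1, y1), (x2, y2) = segment
    return (min(x1, x2), min(y1, y2), max(x1, x2), max(y1, y2))

# Example: segment_aabb(((3.0, 1.0), (1.0, 4.0))) -> (1.0, 1.0, 3.0, 4.0)
```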
And S320, a road boundary inquiry stage.
In the embodiment, the unmanned vehicle obtains the width of the boundary of the surrounding road in real time according to the vehicle model in the road boundary query stage. Optionally, the road boundary query stage includes three steps of determining a target boundary line segment, classifying the target boundary line segment, and calculating a boundary width. Specifically, the method comprises the following steps:
(1) A target boundary line segment is determined. The vehicle is abstracted into a line segment l_ego, and all AABBs representing line segments within a certain radius of the center point of l_ego are queried in the AABB tree. Fig. 3f is a schematic diagram of determining a target boundary line segment according to a third embodiment of the present invention. As shown in fig. 3f, the left arrow represents the line segment into which the vehicle is abstracted, and the circle centered on the arrow represents the query range. The left side of fig. 3f contains all road boundary line segments, and the right side of fig. 3f shows the target boundary line segments obtained by the query.
(2) The target boundary line segments are classified. According to the vehicle line segment coordinates and the target boundary line segment coordinates, the target boundary line segments are divided into two classes: the left boundary lines_left and the right boundary lines_right.
As shown in fig. 3f, lines_left = {lines_1_7, lines_1_8, lines_1_9}, i.e. the left boundary comprises road boundary line segments 1_7, 1_8 and 1_9; lines_right = {lines_2_6, lines_2_7, lines_2_8, lines_2_9, lines_2_10, lines_2_11}, i.e. the right boundary comprises road boundary line segments 2_6, 2_7, 2_8, 2_9, 2_10 and 2_11.
(3) The boundary width is calculated. The shortest distances from the vehicle line segment l_ego to the left and right road boundary sets are calculated respectively. Optionally, the segment-to-segment distance may be defined as:

f(l1, l2) = min(D(l1, l2_s), D(l1, l2_e), D(l2, l1_s), D(l2, l1_e))

where D denotes the distance from a point to a line segment, l_s denotes the start point of line segment l, and l_e denotes its end point.

On this basis, the left road boundary width dist_left is obtained as dist_left = f(l_ego, lines_left), and the right road boundary width dist_right is obtained as dist_right = f(l_ego, lines_right).
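The distance model above translates directly into code; a sketch under the assumption that every line segment, including l_ego, is a pair of (x, y) points.

```python
import math

def point_to_segment(p, seg):
    """D(p, seg): Euclidean distance from point p to line segment seg."""
    (x1, y1), (x2, y2) = seg
    px, py = p
    dx, dy = x2 - x1, y2 - y1
    if dx == 0 and dy == 0:                    # degenerate segment
        return math.hypot(px - x1, py - y1)
    t = ((px - x1) * dx + (py - y1) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))                  # clamp the projection onto the segment
    return math.hypot(px - (x1 + t * dx), py - (y1 + t * dy))

def segment_to_segment(l1, l2):
    """f(l1, l2): minimum of the four endpoint-to-segment distances."""
    return min(point_to_segment(l2[0], l1), point_to_segment(l2[1], l1),
               point_to_segment(l1[0], l2), point_to_segment(l1[1], l2))

def boundary_width(l_ego, boundary_lines):
    """Shortest distance from the vehicle segment to a set of boundary segments."""
    return min(segment_to_segment(l_ego, seg) for seg in boundary_lines)

# dist_left  = boundary_width(l_ego, lines_left)
# dist_right = boundary_width(l_ego, lines_right)
```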
In the embodiment of the invention, the road boundary is discretized into line segments and an axis-aligned bounding rectangle tree is constructed from them, which keeps the data processing scale within a bounded range and reduces the segment-based range query time; the vehicle is abstracted into a line segment and the road boundary width is calculated with the segment-to-segment distance model, which makes the road boundary width calculation more accurate.
Example four
Fig. 4 is a schematic structural diagram of a road boundary detection apparatus according to a fourth embodiment of the present invention. The road boundary detection device may be implemented in software and/or hardware; for example, it may be configured in an unmanned vehicle. As shown in fig. 4, the apparatus includes a reference position determining module 410, a target boundary determining module 420, and a boundary width determining module 430, wherein:
a reference position determining module 410, configured to obtain current position information, and determine reference position information for road boundary detection based on the current position information;
a target boundary determining module 420 for determining target boundary information associated with the reference position information from a pre-constructed axis-aligned bounding rectangular tree;
and a boundary width determining module 430, configured to determine a road boundary width according to the target boundary information and the reference position information.
In the embodiment of the invention, the reference position determining module acquires the current position information and determines the reference position information for road boundary detection based on it; the target boundary determining module determines the target boundary information associated with the reference position information from the pre-constructed axis-aligned bounding rectangle tree, which simplifies the search process for determining the target boundary information; and the boundary width determining module determines the road boundary width according to the target boundary information and the reference position information, which reduces the amount of computation and improves the accuracy of the road boundary width, thereby ensuring the accuracy of the road boundary detection result while improving the real-time performance of road boundary detection.
Optionally, on the basis of the foregoing scheme, the target boundary determining module 420 is specifically configured to:
determining a reference matching range according to the reference position information;
acquiring a rectangular coverage area corresponding to each axis-aligned bounding rectangle in the axis-aligned bounding rectangle tree, and taking the axis-aligned bounding rectangle whose rectangular coverage area intersects the reference matching range as a target axis-aligned bounding rectangle;
and taking the road boundary line segment corresponding to the target axis-aligned bounding rectangle as the target boundary information.
Optionally, on the basis of the foregoing scheme, the boundary width determining module 430 includes:
the boundary information dividing unit is used for determining left boundary information and right boundary information in the target boundary information according to the boundary position information and the reference position information of the target boundary information;
and the boundary width calculation unit is used for determining the width of the left road boundary according to the left boundary information and the reference position information and determining the width of the right road boundary according to the right boundary information and the reference position information.
Optionally, on the basis of the foregoing scheme, the boundary width calculating unit is specifically configured to:
calculating the left road distance between each left boundary information and the reference position information, and taking the shortest left road distance as the width of the left road boundary;
and calculating the right road distance between each piece of right boundary information and the reference position information, and taking the shortest right road distance as the width of the right road boundary.
Optionally, on the basis of the above scheme, the method further includes an axis-aligned bounding rectangle tree building module, configured to:
acquiring target boundary points of a road boundary, and generating a road boundary line segment based on adjacent target boundary points;
and constructing an axis-aligned bounding rectangle corresponding to each road boundary line segment, and constructing an axis-aligned bounding rectangle tree based on the axis-aligned bounding rectangles.
Optionally, on the basis of the above scheme, the axis-aligned bounding rectangle tree building module is specifically configured to:
for each axis-aligned bounding rectangle, creating a target leaf node corresponding to the axis-aligned bounding rectangle, and determining a target level of the target leaf node;
creating a target branch node based on the target level, adding the target leaf node to the target branch node, adding an associated node associated with the target branch node to the target branch node, and deleting the associated node;
and connecting the target branch node with the existing node until all the axis-aligned bounding rectangles are traversed to obtain the axis-aligned bounding rectangle tree.
Optionally, on the basis of the above scheme, the axis-aligned bounding rectangle tree building module is specifically configured to:
and acquiring original boundary points of the road boundary, and performing data preprocessing on the original boundary points to obtain target boundary points.
The road boundary detection device provided by the embodiment of the invention can execute the road boundary detection method provided by any embodiment of the invention, and has corresponding functional modules and beneficial effects of the execution method.
EXAMPLE five
Fig. 5 is a schematic structural diagram of an unmanned vehicle according to a fifth embodiment of the present invention. Fig. 5 illustrates a block diagram of an exemplary unmanned vehicle 512 suitable for use in implementing embodiments of the present invention. The unmanned vehicle 512 shown in fig. 5 is only an example and should not impose any limitation on the function and scope of use of the embodiments of the present invention.
As shown in fig. 5, the unmanned vehicle 512 is in the form of a general purpose computing device. Components of the unmanned vehicle 512 may include, but are not limited to: one or more processors 516, a system memory 528, and a bus 518 that couples the various system components including the system memory 528 and the processors 516. Optionally, the unmanned vehicle 512 further includes a sensor such as a GPS, which is used to determine the position information of the unmanned vehicle.
Bus 518 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, an Industry Standard Architecture (ISA) bus, a Micro Channel Architecture (MCA) bus, an enhanced ISA bus, a Video Electronics Standards Association (VESA) local bus, and a Peripheral Component Interconnect (PCI) bus.
The unmanned vehicle 512 typically includes a variety of computer system readable media. These media may be any available media that can be accessed by the unmanned vehicle 512 and include both volatile and non-volatile media, and removable and non-removable media.
The system memory 528 may include computer system readable media in the form of volatile memory, such as random access memory (RAM) 530 and/or cache memory 532. The unmanned vehicle 512 may further include other removable/non-removable, volatile/non-volatile computer system storage media. By way of example only, the storage system 534 may be used to read from and write to non-removable, non-volatile magnetic media (not shown in fig. 5 and commonly referred to as a "hard drive"). Although not shown in fig. 5, a magnetic disk drive for reading from and writing to a removable, non-volatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, non-volatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to bus 518 through one or more data media interfaces. The memory 528 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of the embodiments of the invention.
A program/utility 540 having a set (at least one) of program modules 542, including but not limited to an operating system, one or more application programs, other program modules, and program data, may be stored in, for example, the memory 528; each of these, or some combination of them, may include an implementation of a network environment. The program modules 542 generally perform the functions and/or methods of the described embodiments of the invention.
The unmanned vehicle 512 may also communicate with one or more external devices 514 (e.g., a keyboard, a pointing device, a display 524, etc.), with one or more devices that enable a user to interact with the unmanned vehicle 512, and/or with any devices (e.g., a network card, a modem, etc.) that enable the unmanned vehicle 512 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interfaces 522. Also, the unmanned vehicle 512 may communicate with one or more networks (e.g., a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet) via the network adapter 520. As shown, the network adapter 520 communicates with the other modules of the unmanned vehicle 512 via the bus 518. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the unmanned vehicle 512, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
The processor 516 executes various functional applications and data processing by running a program stored in the system memory 528, for example, implementing a road boundary detection method provided by an embodiment of the present invention, the method including:
acquiring current position information, and determining reference position information of road boundary detection based on the current position information;
determining target boundary information associated with the reference position information from a pre-constructed axis-aligned bounding rectangular tree;
and determining the width of the road boundary according to the target boundary information and the reference position information.
Of course, those skilled in the art can understand that the processor may also implement the technical solution of the road boundary detection method provided by any embodiment of the present invention.
EXAMPLE six
The sixth embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the road boundary detection method provided by the embodiments of the present invention, the method including:
acquiring current position information, and determining reference position information of road boundary detection based on the current position information;
determining target boundary information associated with the reference position information from a pre-constructed axis-aligned bounding rectangular tree;
and determining the width of the road boundary according to the target boundary information and the reference position information.
Of course, the computer program stored on the computer-readable storage medium provided by the embodiments of the present invention is not limited to the method operations described above, and may also perform operations related to the road boundary detection method provided by any embodiments of the present invention.
Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (10)

1. A road boundary detection method is characterized by comprising the following steps:
acquiring current position information, and determining reference position information of road boundary detection based on the current position information;
determining target boundary information associated with the reference position information from a pre-constructed axis-aligned bounding rectangular tree;
and determining the width of the road boundary according to the target boundary information and the reference position information.
2. The method of claim 1, wherein determining the target boundary information associated with the reference position information from a pre-constructed axis-aligned bounding rectangle tree comprises:
determining a reference matching range according to the reference position information;
acquiring a rectangular coverage area corresponding to each axis-aligned bounding rectangle in the axis-aligned bounding rectangle tree, and taking the axis-aligned bounding rectangle whose rectangular coverage area intersects the reference matching range as a target axis-aligned bounding rectangle;
and taking the road boundary line segment corresponding to the target axis alignment bounding rectangle as the target boundary information.
3. The method of claim 1, wherein determining a road boundary width from the target boundary information and the reference position information comprises:
determining left boundary information and right boundary information in the target boundary information according to the boundary position information of the target boundary information and the reference position information;
and determining the width of the left road boundary according to the left boundary information and the reference position information, and determining the width of the right road boundary according to the right boundary information and the reference position information.
4. The method of claim 3, wherein determining the left road boundary width based on the left boundary information and the reference position information, and determining the right road boundary width based on the right boundary information and the reference position information, comprises:
calculating a left road distance between each piece of left boundary information and the reference position information, and taking the shortest left road distance as the left road boundary width; and
calculating a right road distance between each piece of right boundary information and the reference position information, and taking the shortest right road distance as the right road boundary width.
5. The method of claim 1, further comprising:
acquiring target boundary points of a road boundary, and generating road boundary line segments based on adjacent target boundary points; and
constructing an axis-aligned bounding rectangle corresponding to each road boundary line segment, and constructing an axis-aligned bounding rectangle tree based on the axis-aligned bounding rectangles.
6. The method of claim 5, wherein constructing the axis-aligned bounding rectangle corresponding to each road boundary line segment and constructing the axis-aligned bounding rectangle tree based on the axis-aligned bounding rectangles comprises:
for each axis-aligned bounding rectangle, creating a target leaf node corresponding to the axis-aligned bounding rectangle, and determining a target level of the target leaf node;
creating a target branch node based on the target level, adding the target leaf node to the target branch node, adding an associated node associated with the target branch node to the target branch node, and deleting the associated node; and
connecting the target branch node with the existing nodes, until all the axis-aligned bounding rectangles have been traversed, to obtain the axis-aligned bounding rectangle tree.
7. The method of claim 5, wherein the acquiring target boundary points of the road boundary comprises:
acquiring original boundary points of the road boundary, and performing data preprocessing on the original boundary points to obtain the target boundary points.
8. A road boundary detection device, characterized by comprising:
a reference position determining module, configured to acquire current position information and determine reference position information for road boundary detection based on the current position information;
a target boundary determining module, configured to determine target boundary information associated with the reference position information from a pre-constructed axis-aligned bounding rectangle tree; and
a boundary width determining module, configured to determine a road boundary width according to the target boundary information and the reference position information.
9. An unmanned vehicle, comprising:
one or more processors;
storage means for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the road boundary detection method of any one of claims 1-7.
10. A computer-readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the road boundary detection method of any one of claims 1-7.
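
For readers less familiar with the claimed construction, the following is a minimal, purely illustrative Python sketch of the boundary-point preprocessing and segment/rectangle construction described in claims 5 and 7. It is not the patented implementation; the function names (preprocess_points, segments_and_rectangles), the min_spacing threshold, and the choice of de-duplication as the preprocessing step are assumptions made for illustration only.

import math
from typing import List, Tuple

Point = Tuple[float, float]                 # (x, y) in the map frame
Segment = Tuple[Point, Point]
Rect = Tuple[float, float, float, float]    # (min_x, min_y, max_x, max_y)

def preprocess_points(raw: List[Point], min_spacing: float = 0.1) -> List[Point]:
    """One plausible form of the 'data preprocessing' of claim 7: drop
    near-duplicate consecutive points. The actual preprocessing is not
    specified in this section."""
    cleaned: List[Point] = []
    for p in raw:
        if not cleaned or math.dist(p, cleaned[-1]) >= min_spacing:
            cleaned.append(p)
    return cleaned

def segments_and_rectangles(points: List[Point]) -> List[Tuple[Rect, Segment]]:
    """Claim 5 in miniature: join adjacent target boundary points into road
    boundary line segments and wrap each segment in its axis-aligned
    bounding rectangle."""
    out: List[Tuple[Rect, Segment]] = []
    for a, b in zip(points, points[1:]):
        rect = (min(a[0], b[0]), min(a[1], b[1]),
                max(a[0], b[0]), max(a[1], b[1]))
        out.append((rect, (a, b)))
    return out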
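
Claim 6 builds the axis-aligned bounding rectangle tree incrementally through target levels, target branch nodes and associated nodes; those details are not reproduced below. Instead, this much-simplified sketch (hypothetical Node type, fixed fan-out) only illustrates the general idea of grouping leaf rectangles under branch nodes whose bounds enclose their children.

from dataclasses import dataclass, field
from typing import List, Optional, Tuple

Rect = Tuple[float, float, float, float]    # (min_x, min_y, max_x, max_y)

@dataclass
class Node:
    bounds: Rect
    rect: Optional[Rect] = None             # set only on leaf nodes
    children: List['Node'] = field(default_factory=list)

def union(a: Rect, b: Rect) -> Rect:
    """Smallest axis-aligned rectangle enclosing both inputs."""
    return (min(a[0], b[0]), min(a[1], b[1]), max(a[2], b[2]), max(a[3], b[3]))

def build_rectangle_tree(rects: List[Rect], fanout: int = 8) -> Node:
    """Wrap each rectangle in a leaf, group leaves into branch nodes of
    bounded fan-out, and hang the branches under a root whose bounds
    enclose everything."""
    if not rects:
        return Node(bounds=(0.0, 0.0, 0.0, 0.0))
    branches: List[Node] = []
    for i in range(0, len(rects), fanout):
        chunk = rects[i:i + fanout]
        leaves = [Node(bounds=r, rect=r) for r in chunk]
        bounds = chunk[0]
        for r in chunk[1:]:
            bounds = union(bounds, r)
        branches.append(Node(bounds=bounds, children=leaves))
    root_bounds = branches[0].bounds
    for b in branches[1:]:
        root_bounds = union(root_bounds, b.bounds)
    return Node(bounds=root_bounds, children=branches)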
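
The query side of claims 1 and 2 reduces to an intersection test between a matching range derived from the reference position and the rectangles in the tree. The sketch below is a simplification, not the claimed method: it scans a flat list of (rectangle, segment) pairs rather than walking the tree, and assumes a square matching range of a chosen radius, since the claims do not fix the shape or size of the range.

from typing import List, Tuple

Point = Tuple[float, float]
Segment = Tuple[Point, Point]
Rect = Tuple[float, float, float, float]    # (min_x, min_y, max_x, max_y)

def reference_matching_range(ref: Point, radius: float) -> Rect:
    """A square matching range centred on the reference position (an assumed
    concrete form of the 'reference matching range' of claim 2)."""
    x, y = ref
    return (x - radius, y - radius, x + radius, y + radius)

def rects_intersect(a: Rect, b: Rect) -> bool:
    """True if two axis-aligned rectangles overlap (touching counts)."""
    return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

def query_target_segments(ref: Point, radius: float,
                          rect_to_segment: List[Tuple[Rect, Segment]]) -> List[Segment]:
    """Keep every road boundary line segment whose axis-aligned bounding
    rectangle intersects the reference matching range. A tree-based
    implementation would instead prune whole branches whose bounds miss
    the range, which is what makes the lookup fast in practice."""
    rng = reference_matching_range(ref, radius)
    return [seg for rect, seg in rect_to_segment if rects_intersect(rect, rng)]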
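
Claims 3 and 4 split the retrieved boundary information into left and right sides and take the shortest distance on each side as the corresponding road boundary width. The sketch below assumes a heading vector is available to decide left versus right; the claims do not specify how that split is performed, so side_of and the heading parameter are illustrative assumptions.

import math
from typing import List, Tuple

Point = Tuple[float, float]
Segment = Tuple[Point, Point]

def point_segment_distance(p: Point, seg: Segment) -> float:
    """Euclidean distance from a point to a line segment."""
    (x1, y1), (x2, y2) = seg
    px, py = p
    dx, dy = x2 - x1, y2 - y1
    if dx == 0 and dy == 0:
        return math.hypot(px - x1, py - y1)
    t = ((px - x1) * dx + (py - y1) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))
    return math.hypot(px - (x1 + t * dx), py - (y1 + t * dy))

def side_of(ref: Point, heading: Tuple[float, float], seg: Segment) -> str:
    """Classify a segment as 'left' or 'right' of the reference position via
    the cross product of the heading with the vector to the segment midpoint."""
    mx = (seg[0][0] + seg[1][0]) / 2.0 - ref[0]
    my = (seg[0][1] + seg[1][1]) / 2.0 - ref[1]
    cross = heading[0] * my - heading[1] * mx
    return 'left' if cross > 0 else 'right'

def road_boundary_widths(ref: Point, heading: Tuple[float, float],
                         segments: List[Segment]) -> Tuple[float, float]:
    """Shortest left-side and right-side distances, in the spirit of
    claims 3 and 4; returns infinity for a side with no boundary."""
    left = [point_segment_distance(ref, s) for s in segments
            if side_of(ref, heading, s) == 'left']
    right = [point_segment_distance(ref, s) for s in segments
             if side_of(ref, heading, s) == 'right']
    return (min(left) if left else float('inf'),
            min(right) if right else float('inf'))

Returning infinity for an empty side is only one possible convention; a deployed system would more likely flag the missing boundary explicitly.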
CN202010538071.2A 2020-06-12 2020-06-12 Road boundary detection method, device, unmanned vehicle and storage medium Active CN113761990B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010538071.2A CN113761990B (en) 2020-06-12 2020-06-12 Road boundary detection method, device, unmanned vehicle and storage medium

Publications (2)

Publication Number Publication Date
CN113761990A true CN113761990A (en) 2021-12-07
CN113761990B CN113761990B (en) 2024-04-09

Family

ID=78785381

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010538071.2A Active CN113761990B (en) 2020-06-12 2020-06-12 Road boundary detection method, device, unmanned vehicle and storage medium

Country Status (1)

Country Link
CN (1) CN113761990B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050209748A1 (en) * 2004-03-12 2005-09-22 Toyota Jidosha Kabushiki Kaisha Lane boundary detector
US20090296987A1 (en) * 2008-05-27 2009-12-03 Toyota Jidosha Kabushiki Kaisha Road lane boundary detection system and road lane boundary detecting method
JP2011243161A (en) * 2010-05-21 2011-12-01 Denso Corp Lane boundary detection apparatus and lane boundary detection program
US20160153152A1 (en) * 2014-11-28 2016-06-02 International Business Machines Corporation Method and apparatus for determining a road network partitioning border line
CN111095291A (en) * 2018-02-27 2020-05-01 辉达公司 Real-time detection of lanes and boundaries by autonomous vehicles
CN111104410A (en) * 2019-12-25 2020-05-05 北京经纬恒润科技有限公司 Local road information extraction method and device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
JIANMIN D. et al.: "Real time road edges detection and road signs recognition", 2015 International Conference on Control, Automation and Information Sciences (ICCAIS)
XU Jie, LI Xiaohu, WANG Rongben, SHI Pengfei: "Road boundary recognition algorithm in autonomous vehicle navigation", Journal of Image and Graphics (中国图象图形学报), no. 06

Also Published As

Publication number Publication date
CN113761990B (en) 2024-04-09

Similar Documents

Publication Publication Date Title
JP7082151B2 (en) Map trajectory matching data quality determination method, equipment, server and medium
US10627520B2 (en) Method and apparatus for constructing reflectance map
WO2015078238A1 (en) Dispatching map matching tasks by cluster server in internet of vehicles
CN107883974B (en) Navigation path planning method, navigation server and computer readable medium
CN109558854B (en) Obstacle sensing method and device, electronic equipment and storage medium
CN110677815A (en) Stay point identification method and device, computer equipment and storage medium
CN110553658B (en) Navigation path recommendation method, navigation server, computer device and readable medium
CN110006439B (en) Map track data matching method, map track data matching device, server and storage medium
EP3794312B1 (en) Indoor location-based service
WO2018058888A1 (en) Street view image recognition method and apparatus, server and storage medium
CN110647675B (en) Method and device for recognition of stop point and training of prediction model and storage medium
CN113010793A (en) Method, device, equipment, storage medium and program product for map data processing
CN110542425B (en) Navigation path selection method, navigation device, computer equipment and readable medium
US9910878B2 (en) Methods for processing within-distance queries
CN110555432B (en) Method, device, equipment and medium for processing interest points
CN111354217A (en) Parking route determining method, device, equipment and medium
CN115510175A (en) Method and device for converting geographical coordinates of dwg data, computer equipment and medium
CN113761990B (en) Road boundary detection method, device, unmanned vehicle and storage medium
CN114578401B (en) Method and device for generating lane track points, electronic equipment and storage medium
CN114136327B (en) Automatic checking method and system for recall ratio of broken line segment
CN111581471B (en) Regional vehicle checking method, device, server and medium
CN113139032A (en) Geographic position searching method and device, electronic equipment and storage medium
CN114840539A (en) Data processing method, device, equipment and storage medium
CN110659312B (en) Data processing method, device, equipment and computer storage medium
US20160356608A1 (en) Map-matching by dual-level heuristic search

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant