CN113761990B - Road boundary detection method, device, unmanned vehicle and storage medium - Google Patents

Road boundary detection method, device, unmanned vehicle and storage medium

Info

Publication number
CN113761990B
CN113761990B (application CN202010538071.2A)
Authority
CN
China
Prior art keywords
boundary
target
position information
road
axis alignment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010538071.2A
Other languages
Chinese (zh)
Other versions
CN113761990A (en)
Inventor
陈建兴 (Chen Jianxing)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Jingdong Qianshi Technology Co Ltd
Original Assignee
Beijing Jingdong Qianshi Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Jingdong Qianshi Technology Co Ltd filed Critical Beijing Jingdong Qianshi Technology Co Ltd
Priority to CN202010538071.2A priority Critical patent/CN113761990B/en
Publication of CN113761990A publication Critical patent/CN113761990A/en
Application granted granted Critical
Publication of CN113761990B publication Critical patent/CN113761990B/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/13 Edge detection
    • G06T7/60 Analysis of geometric attributes
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30248 Vehicle exterior or interior
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle
    • G06T2207/30256 Lane; Road marking

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Navigation (AREA)

Abstract

The embodiment of the invention discloses a road boundary detection method, a device, an unmanned vehicle and a storage medium, wherein the method comprises the following steps: acquiring current position information, and determining reference position information of road boundary detection based on the current position information; determining target boundary information associated with the reference position information from a pre-built axis alignment bounding rectangle tree; and determining the road boundary width according to the target boundary information and the reference position information. The road boundary detection method provided by the embodiment of the invention detects the boundary of the road currently being travelled through the pre-constructed axis alignment bounding rectangle tree and the current position information, thereby ensuring the accuracy of the road boundary detection result while improving the real-time performance of road boundary detection.

Description

Road boundary detection method, device, unmanned vehicle and storage medium
Technical Field
The embodiment of the invention relates to the technical field of boundary detection, in particular to a road boundary detection method, a road boundary detection device, an unmanned vehicle and a storage medium.
Background
During automated driving, an unmanned vehicle generally needs to acquire the boundary of the surrounding road from a map so as to provide a range for route planning. There are two common methods for calculating the road boundary width: 1. directly reading the road width stored in the map; 2. using the shortest "point to line segment" distance as the boundary width.
In the process of implementing the present invention, the inventor found that the prior art has at least the following technical problems: directly reading the road width from the map is inaccurate, since a single road is treated as having a single width, which obviously does not meet the accuracy required by route planning; using the shortest point-to-segment distance alleviates the accuracy problem, but the exhaustive search it requires makes the real-time performance poor and the calculation inefficient, and in areas where the boundary shape changes greatly the result of this calculation model does not meet the traffic requirements of unmanned vehicles.
Disclosure of Invention
The embodiment of the invention provides a road boundary detection method, a device, an unmanned vehicle and a storage medium, which solve the technical problems of inaccurate road boundary detection and poor real-time performance in the prior art, and ensure the accuracy of the road boundary detection result while improving the real-time performance of road boundary detection.
In a first aspect, an embodiment of the present invention provides a road boundary detection method, including:
acquiring current position information, and determining reference position information of road boundary detection based on the current position information;
determining target boundary information associated with the reference position information from a pre-built axis alignment bounding rectangular tree;
and determining the road boundary width according to the target boundary information and the reference position information.
In a second aspect, an embodiment of the present invention further provides a road boundary detection apparatus, including:
the reference position determining module is used for acquiring current position information and determining reference position information of road boundary detection based on the current position information;
a target boundary determination module for determining target boundary information associated with the reference position information from a pre-built axis alignment bounding rectangular tree;
and the boundary width determining module is used for determining the road boundary width according to the target boundary information and the reference position information.
In a third aspect, an embodiment of the present invention further provides an unmanned vehicle, where the unmanned vehicle includes:
one or more processors;
storage means for storing one or more programs;
The one or more programs, when executed by the one or more processors, cause the one or more processors to implement the road boundary detection method as provided by any embodiment of the present invention.
In a fourth aspect, an embodiment of the present invention further provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the road boundary detection method as provided by any embodiment of the present invention.
According to the embodiment of the invention, the current position information is acquired, and the reference position information of road boundary detection is determined based on the current position information; the target boundary information associated with the reference position information is determined from the pre-built axis alignment bounding rectangle tree, which simplifies the search process for determining the target boundary information; the road boundary width is then determined according to the target boundary information and the reference position information, which reduces the amount of calculation of the road boundary width and improves its accuracy, so that the accuracy of the road boundary detection result is ensured while the real-time performance of road boundary detection is improved.
Drawings
FIG. 1 is a flowchart of a road boundary detection method according to an embodiment of the present invention;
FIG. 2 is a flowchart of a road boundary detection method according to a second embodiment of the present invention;
FIG. 3a is a flowchart of a road boundary detection method according to a third embodiment of the present invention;
FIG. 3b is a diagram of a discretized road boundary according to a third embodiment of the present invention;
FIG. 3c is a schematic diagram of encoding a road boundary segment according to a third embodiment of the present invention;
FIG. 3d is a schematic view of an axis alignment bounding rectangle provided by a third embodiment of the present invention;
FIG. 3e is a schematic view of an axis alignment bounding rectangle tree provided by a third embodiment of the present invention;
FIG. 3f is a schematic diagram illustrating the determination of a target boundary line segment according to a third embodiment of the present invention;
FIG. 4 is a schematic structural diagram of a road boundary detection device according to a fourth embodiment of the present invention;
FIG. 5 is a schematic structural diagram of an unmanned vehicle according to a fifth embodiment of the present invention.
Detailed Description
The invention is described in further detail below with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting thereof. It should be further noted that, for convenience of description, only some, but not all of the structures related to the present invention are shown in the drawings.
Example One
FIG. 1 is a flowchart of a road boundary detection method according to an embodiment of the invention. The embodiment is applicable to detecting a road boundary, and is particularly applicable to detecting the boundary of the road on which an unmanned vehicle is currently driving. The method may be performed by a road boundary detection device, which may be implemented in software and/or hardware and may, for example, be deployed in an unmanned vehicle. As shown in FIG. 1, the method includes:
S110, acquiring current position information, and determining reference position information of road boundary detection based on the current position information.
In this embodiment, the current location information may be the current location information of the unmanned vehicle. Alternatively, the current position information of the unmanned vehicle may be determined by a Global Positioning System (GPS) sensor in the unmanned vehicle. After the current position information of the unmanned vehicle is acquired, the reference position information of road boundary detection is determined based on it, where the reference position information represents the reference position for calculating the road boundary width and is used for determining the matching range of the road boundary.
In the present embodiment, the form of the reference position information is not limited here. Alternatively, the reference position information may be a center point position of the unmanned vehicle, or may be a center line segment position of the unmanned vehicle. Preferably, the reference position information is a vehicle position line segment of the target vehicle. The target vehicle is an unmanned vehicle, namely, the reference position information is the position of a central line segment of the unmanned vehicle. The center line segment position of the unmanned vehicle is used as the reference position information to calculate the road boundary width, so that the distance between each part of the unmanned vehicle and the road boundary is considered in the calculation of the road boundary width, and the calculation of the road boundary width is more reasonable and accurate.
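To make the preferred option concrete, the following is a minimal sketch of how a vehicle center line segment could be derived from a pose; the function name, the pose parameters and the Python types are illustrative assumptions, not part of the patent.

```python
import math
from typing import Tuple

Point = Tuple[float, float]


def vehicle_center_segment(x: float, y: float, heading_rad: float,
                           vehicle_length: float) -> Tuple[Point, Point]:
    """Abstract the vehicle as its longitudinal center line segment.

    (x, y) is the vehicle center in map coordinates, heading_rad the yaw
    angle, vehicle_length the bumper-to-bumper length. Returns the two
    endpoints of the center line segment used as reference position info.
    """
    half = vehicle_length / 2.0
    dx, dy = math.cos(heading_rad) * half, math.sin(heading_rad) * half
    return (x - dx, y - dy), (x + dx, y + dy)


# Example: a 4.5 m vehicle at (10, 3) heading due east.
rear, front = vehicle_center_segment(10.0, 3.0, 0.0, 4.5)
```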
S120, determining target boundary information associated with the reference position information from a pre-built axis alignment bounding rectangular tree.
This step addresses the technical problem of poor real-time performance in the prior art, which is caused by having to calculate the distance between every road boundary line segment and the reference position information. In this embodiment, the road boundary in the map is constructed in advance as an axis alignment bounding rectangle tree (Axis-Aligned Bounding Box tree, AABB tree), a subset of all road boundary line segments is selected as target boundary information based on this pre-built tree, and the road boundary width is calculated based on the target boundary information. Compared with the prior-art approach of calculating the road boundary widths corresponding to all road boundary line segments, calculating the width from only the selected subset of road boundary line segments reduces the amount of calculation and improves the real-time performance of road boundary detection.
Alternatively, the target boundary information associated with the reference position information may be the road boundary information within a reference matching range of a set size corresponding to the reference position information. In one embodiment of the present invention, determining the target boundary information associated with the reference position information from the pre-built axis alignment bounding rectangle tree includes: determining a reference matching range according to the reference position information; acquiring the rectangular coverage area corresponding to each axis alignment bounding rectangle in the axis alignment bounding rectangle tree, and taking the axis alignment bounding rectangle whose rectangular coverage area intersects the reference matching range as a target axis alignment bounding rectangle; and taking the road boundary line segment corresponding to the target axis alignment bounding rectangle as the target boundary information.
The query interface of the AABB tree provides a function for determining the line segments associated with a designated area. In this embodiment, this query function is used with the reference matching range as the designated area to find, in the AABB tree, the road boundary line segments associated with the reference matching range as the target boundary information. Specifically, the rectangular coverage area corresponding to each axis alignment bounding rectangle in the AABB tree is obtained, and the road boundary line segments corresponding to the axis alignment bounding rectangles that intersect the reference matching range are taken as the target boundary information. The current position information, reference matching range, rectangular coverage area and the like referred to above may all be expressed in the map coordinate system.
Alternatively, the reference matching range may be determined based on the observable range of the unmanned vehicle. For example, the set radius of the reference matching range may be determined from the observable range of the unmanned vehicle. After the reference position information is determined, the center of the reference matching range is determined based on the reference position, and a circular area with the set radius, centered on that center point, is taken as the reference matching range. If the reference position information is the center point position of the unmanned vehicle, it can be used directly as the center of the reference matching range; if the reference position information is the center line segment position of the unmanned vehicle, the center point of that center line segment may be used as the center of the reference matching range.
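The query described above can be sketched as a recursive traversal of the AABB tree. The node layout, the circle-rectangle intersection test and all names below are assumptions for illustration; the patent only specifies that rectangles intersecting the reference matching range are returned.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

Segment = Tuple[Tuple[float, float], Tuple[float, float]]


@dataclass
class AABB:
    min_x: float
    min_y: float
    max_x: float
    max_y: float

    def intersects_circle(self, cx: float, cy: float, r: float) -> bool:
        # Per-axis clamped distance from the circle center to the rectangle.
        dx = max(self.min_x - cx, 0.0, cx - self.max_x)
        dy = max(self.min_y - cy, 0.0, cy - self.max_y)
        return dx * dx + dy * dy <= r * r


@dataclass
class Node:
    box: AABB
    left: Optional["Node"] = None
    right: Optional["Node"] = None
    segment: Optional[Segment] = None  # set only on leaf nodes


def query_segments(node: Optional[Node], cx: float, cy: float,
                   radius: float, out: List[Segment]) -> None:
    """Collect road boundary segments whose AABB intersects the circular
    reference matching range centered at (cx, cy)."""
    if node is None or not node.box.intersects_circle(cx, cy, radius):
        return
    if node.segment is not None:          # leaf: candidate boundary segment
        out.append(node.segment)
        return
    query_segments(node.left, cx, cy, radius, out)
    query_segments(node.right, cx, cy, radius, out)
```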
S130, determining the road boundary width according to the target boundary information and the reference position information.
In the present embodiment, after the target boundary information is determined, the road boundary width may be calculated from the target boundary information and the reference position information. The calculation mode of the road boundary width can be set according to actual requirements. Alternatively, the total width of the road boundary, i.e. the width between the left road boundary and the right road boundary, may be calculated according to the target boundary information; or the distance of the unmanned vehicle from the left road boundary (the left road side boundary width) and its distance from the right road boundary (the right road side boundary width) may be calculated according to the reference position information and the target boundary information.
In the above two calculation modes of road boundary widths, it is necessary to determine the left road boundary and the right road boundary. Alternatively, the positional relationship between the road boundary line segment and the reference position information may be determined according to the position information (e.g., line segment coordinates) and the reference position information (e.g., reference coordinates) of each road boundary line segment in the target boundary information, where the road boundary line segment located on the left side of the reference position information is used as the left road boundary, and the road boundary line segment located on the right side of the reference position information is used as the right road boundary. After the left road boundary and the right road boundary are determined, the total width of the road boundary or the left road boundary width and the right road boundary width can be calculated based on the left road boundary and the right road boundary.
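The patent does not fix a particular left/right test; one common choice, sketched here under that assumption, uses the sign of the 2D cross product between the vehicle direction and the vector to a segment's midpoint.

```python
from typing import List, Tuple

Point = Tuple[float, float]
Segment = Tuple[Point, Point]


def split_left_right(vehicle_seg: Segment,
                     segments: List[Segment]) -> Tuple[List[Segment], List[Segment]]:
    """Split target boundary segments into those left and right of the vehicle.

    Uses the sign of the cross product of the vehicle direction with the
    vector from the vehicle rear point to each segment midpoint (an
    illustrative test; the patent only states that positions are compared).
    """
    (rx, ry), (fx, fy) = vehicle_seg
    hx, hy = fx - rx, fy - ry                      # vehicle heading vector
    left, right = [], []
    for (p1, p2) in segments:
        mx, my = (p1[0] + p2[0]) / 2.0, (p1[1] + p2[1]) / 2.0
        cross = hx * (my - ry) - hy * (mx - rx)
        (left if cross > 0 else right).append((p1, p2))
    return left, right
```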
Generally, in order to improve the accuracy of road planning, it is necessary to determine a left road side boundary width of the unmanned vehicle from the left road boundary and a right road side boundary width of the unmanned vehicle from the right road boundary. In one embodiment of the present invention, determining a road boundary width from target boundary information and reference position information includes: determining left boundary information and right boundary information in the target boundary information according to the boundary position information and the reference position information of the target boundary information; and determining the width of the left road side boundary according to the left boundary information and the reference position information, and determining the width of the right road side boundary according to the right boundary information and the reference position information. Optionally, after determining the left road boundary and the right road boundary, the left road boundary is used as left boundary information, the right road boundary is used as right boundary information, the left road boundary width is calculated according to the left boundary information and the reference position information, and the right road boundary width is calculated according to the right boundary information and the reference position information.
Optionally, determining the left road side boundary width according to the left boundary information and the reference position information, and determining the right road side boundary width according to the right boundary information and the reference position information, includes: calculating the left road distance between each piece of left boundary information and the reference position information, and taking the shortest left road distance as the left road boundary width; and calculating the right road distance between each piece of right boundary information and the reference position information, and taking the shortest right road distance as the right road boundary width. Specifically, the left road distance between each left road boundary in the left boundary information and the reference position information is calculated and the shortest left road distance is taken as the left road boundary width; likewise, the right road distance between each right road boundary in the right boundary information and the reference position information is calculated and the shortest right road distance is taken as the right road boundary width. If the reference position information is the center point position of the unmanned vehicle, the distance between a left or right road boundary and the reference position information can be calculated as a point-to-segment distance; if the reference position information is the center line segment position of the unmanned vehicle, it can be calculated as a segment-to-segment distance. Preferably, the center line segment position of the unmanned vehicle is used as the reference position information, which makes the calculation of the road boundary width more accurate.
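A minimal sketch of the width selection described above, assuming non-empty left/right sets and an externally supplied segment-to-segment distance function (for example, the f(l1, l2) sketched after the distance formula in Example Three):

```python
from typing import Callable, List, Tuple

Point = Tuple[float, float]
Segment = Tuple[Point, Point]
DistanceFn = Callable[[Segment, Segment], float]


def boundary_widths(vehicle_seg: Segment,
                    left_segments: List[Segment],
                    right_segments: List[Segment],
                    seg_distance: DistanceFn) -> Tuple[float, float]:
    """Left/right road boundary widths as the shortest segment-to-segment
    distance from the vehicle line segment to each boundary set."""
    left_width = min(seg_distance(vehicle_seg, s) for s in left_segments)
    right_width = min(seg_distance(vehicle_seg, s) for s in right_segments)
    return left_width, right_width
```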
According to the embodiment of the invention, the current position information is acquired, and the reference position information is determined based on the current position information; the target boundary information associated with the reference position information is determined from the pre-built axis alignment bounding rectangle tree, which simplifies the search process for determining the target boundary information; the road boundary width is then determined according to the target boundary information and the reference position information, which reduces the amount of calculation of the road boundary width and improves its accuracy, so that the accuracy of the road boundary detection result is ensured while the real-time performance of road boundary detection is improved.
Example Two
Fig. 2 is a flowchart of a road boundary detection method according to a second embodiment of the present invention. The present embodiment is further optimized based on the above embodiments. As shown in fig. 2, the method includes:
S210, acquiring target boundary points of the road boundary, and generating road boundary line segments based on adjacent target boundary points.
In this embodiment, an axis alignment bounding rectangular tree corresponding to a road boundary line segment is constructed according to a road boundary in a map, so as to reduce the calculation amount in the calculation of the road boundary width and improve the real-time performance of road boundary detection. Alternatively, the operation of constructing the axis alignment bounding rectangle tree may be performed before the real-time detection of the road boundary, or a new axis alignment bounding rectangle tree may be constructed according to the updated map information after the update of the map information, which is not limited herein. Alternatively, the target boundary points of the road boundary may be obtained from the map, and adjacent discrete target boundary points in all the road boundaries are connected to form a road boundary line segment, and the target boundary points are encoded to obtain the encoding of the road boundary line segment.
In one embodiment, obtaining the target boundary points of the road boundary includes: acquiring the original boundary points of the road boundary, and performing data preprocessing on the original boundary points to obtain the target boundary points. Alternatively, when the map is a high-precision map, the density of the original boundary points in the map is high, and directly generating road boundary line segments from these original points and constructing the axis alignment bounding rectangle tree from them would require a large amount of calculation. In order to reduce this amount of calculation, the original boundary points in the map can be screened to obtain the target boundary points, and the road boundary line segments are constructed based on the screened target boundary points. Optionally, the data preprocessing of the original boundary points may be a data deduplication operation, so as to obtain a simplified set of target boundary points that still accurately characterizes the road trend.
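A possible form of this preprocessing is sketched below; the spacing-threshold rule is an assumption, since the patent only requires a deduplication step that preserves the road trend.

```python
from typing import List, Tuple

Point = Tuple[float, float]


def thin_boundary_points(points: List[Point], min_spacing: float) -> List[Point]:
    """Simple deduplication/thinning of raw high-precision-map boundary points:
    drop a point if it is closer than `min_spacing` to the last kept point.
    (Illustrative only; other preprocessing schemes would also satisfy the
    requirement of removing redundant points while keeping the road trend.)"""
    if not points:
        return []
    kept = [points[0]]
    for x, y in points[1:]:
        lx, ly = kept[-1]
        if (x - lx) ** 2 + (y - ly) ** 2 >= min_spacing ** 2:
            kept.append((x, y))
    return kept
```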
S220, constructing an axis alignment bounding rectangle corresponding to each road boundary line segment, and constructing an axis alignment bounding rectangle tree based on the axis alignment bounding rectangles.
In this embodiment, after the road boundary line segments are generated, an axis alignment bounding rectangle (Axis-Aligned Bounding Box, AABB) corresponding to each road boundary line segment is constructed; the AABB is constructed from the vertices of the encoded line segment, and the AABBs of the line segments are then organized into a binary tree, thereby obtaining an AABB tree. Constructing the AABB tree keeps the segment-based range query time within a scale of log(n), where n is the number of road boundary line segments, which reduces the query time for the target boundary information. Alternatively, for each road boundary line segment, the rectangle having that line segment as its diagonal may be taken as the axis alignment bounding rectangle corresponding to the line segment. It should be noted that, after the axis alignment bounding rectangle tree is constructed, the rectangular coverage area corresponding to each axis alignment bounding rectangle needs to be determined, so as to facilitate the subsequent query of the target boundary information.
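The segment generation and per-segment AABB construction can be sketched as follows; the helper names and tuple-based types are illustrative.

```python
from typing import List, Tuple

Point = Tuple[float, float]
Segment = Tuple[Point, Point]
# An axis alignment bounding rectangle as (min_x, min_y, max_x, max_y).
Box = Tuple[float, float, float, float]


def boundary_points_to_segments(points: List[Point]) -> List[Segment]:
    """Connect adjacent target boundary points into road boundary segments."""
    return [(points[i], points[i + 1]) for i in range(len(points) - 1)]


def segment_aabb(seg: Segment) -> Box:
    """AABB of a segment: the rectangle that has the segment as a diagonal."""
    (x1, y1), (x2, y2) = seg
    return (min(x1, x2), min(y1, y2), max(x1, x2), max(y1, y2))


# Example: three boundary points yield two segments and two leaf rectangles.
pts = [(0.0, 0.0), (5.0, 1.0), (9.0, 3.0)]
leaf_boxes = [segment_aabb(s) for s in boundary_points_to_segments(pts)]
```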
In one embodiment of the present invention, constructing an axis alignment bounding rectangle corresponding to each road boundary line segment, constructing an axis alignment bounding rectangle tree based on the axis alignment bounding rectangles, includes: for each axis alignment bounding rectangle, creating a target leaf node corresponding to the axis alignment bounding rectangle, and determining a target level of the target leaf node; creating a target branch node based on the target level, adding a target leaf node to the target branch node, adding an associated node associated with the target branch node to the target branch node, and deleting the associated node; and connecting the target branch node with the existing node until all the axis alignment bounding rectangles are traversed, and obtaining an axis alignment bounding rectangle tree. In this embodiment, the axis alignment bounding rectangle tree may be constructed based on the corresponding axis alignment bounding rectangle of each road boundary line segment by the existing axis alignment bounding rectangle tree construction method. It can be understood that each axis alignment bounding rectangle can be traversed, corresponding nodes are constructed, and the nodes corresponding to the axis alignment bounding rectangles are associated according to the association relation between the axis alignment bounding rectangles, so that a constructed axis alignment bounding rectangle tree is obtained.
Specifically, a leaf node corresponding to the first axis alignment bounding rectangle may be created and used as the root of the axis alignment bounding rectangle tree, and the nodes corresponding to subsequent axis alignment bounding rectangles are then added to the tree starting from the current root. Assume the initial axis alignment bounding rectangle is object (1) and the subsequently added axis alignment bounding rectangle is object (2). A new branch node is created and assigned an axis alignment bounding rectangle large enough to contain object (1) and object (2); a new leaf node is created for object (2) and attached to the new branch node; the original leaf node of object (1) is then attached to the same branch node; and finally the new branch node is taken as the root of the tree. When a new object (3) is added, a new branch node (B) is created and assigned an axis alignment bounding rectangle containing object (1) and object (3); a new leaf node is created for object (3) and assigned to branch node (B); leaf node (1) is then moved to be a child node of branch node (B), and branch node (B) is attached under branch node (A); finally, because the axis alignment bounding rectangle assigned to branch node (A) is no longer large enough to contain the axis alignment bounding rectangles of all its descendants, it is adjusted (enlarged) to account for the new leaf node.
Essentially, every time an object is added to the axis alignment bounding rectangle tree, a corresponding adjustment is performed on the tree so that the rules for branch and root nodes described above remain satisfied. To summarize, when an object is added to the tree, a leaf node is first created for the object and assigned an axis alignment bounding rectangle according to the object; next, the best existing node (leaf or branch) in the tree is found to become the sibling of the new leaf; a new branch node is then created for that located node and the new leaf, and assigned an axis alignment bounding rectangle containing both (essentially the union of the located node's and the new leaf's axis alignment bounding rectangles); the new leaf is attached to the new branch node, the located node is detached from the tree and re-attached to the new branch node; the new branch node is then connected as a child of the located node's former parent; finally, walking back up the tree, all ancestor axis alignment bounding rectangles are adjusted to ensure that each still contains the axis alignment bounding rectangles of all its descendants. In the above process, finding the best existing node in the tree is particularly important: the procedure descends the tree and evaluates the cost of attaching the new leaf on the left or right side of each branch. The better this decision, the more balanced the tree, and the lower the cost of subsequent queries. A common choice is to take as the cost the surface area (perimeter, in the two-dimensional case) that the left or right child's axis alignment bounding rectangle would have after being enlarged to include the new leaf, and to descend toward the child with the lower cost until a leaf node is reached.
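A simplified sketch of this insertion procedure is given below, using the perimeter as the 2D analogue of the surface-area cost; it is an illustrative implementation under those assumptions, not the patent's reference code, and omits refinements such as tree rotations.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Box:
    min_x: float
    min_y: float
    max_x: float
    max_y: float

    def union(self, other: "Box") -> "Box":
        return Box(min(self.min_x, other.min_x), min(self.min_y, other.min_y),
                   max(self.max_x, other.max_x), max(self.max_y, other.max_y))

    def perimeter(self) -> float:            # 2D analogue of "surface area"
        return 2.0 * ((self.max_x - self.min_x) + (self.max_y - self.min_y))


@dataclass
class Node:
    box: Box
    left: Optional["Node"] = None
    right: Optional["Node"] = None
    parent: Optional["Node"] = None
    segment_id: Optional[int] = None          # leaves only

    def is_leaf(self) -> bool:
        return self.left is None and self.right is None


def insert_leaf(root: Optional[Node], leaf: Node) -> Node:
    """Insert a leaf (one segment AABB) into the tree; returns the new root."""
    if root is None:
        return leaf                            # first object becomes the root

    # 1. Descend toward the child whose box grows the least when enlarged
    #    to contain the new leaf (the cost heuristic described above).
    node = root
    while not node.is_leaf():
        cost_left = node.left.box.union(leaf.box).perimeter()
        cost_right = node.right.box.union(leaf.box).perimeter()
        node = node.left if cost_left < cost_right else node.right

    # 2. Create a new branch node whose box contains the located node and
    #    the new leaf, and splice it in place of the located node.
    old_parent = node.parent
    branch = Node(box=node.box.union(leaf.box), left=node, right=leaf,
                  parent=old_parent)
    node.parent = leaf.parent = branch
    if old_parent is None:
        root = branch
    elif old_parent.left is node:
        old_parent.left = branch
    else:
        old_parent.right = branch

    # 3. Walk back up and enlarge ancestor boxes so every branch still
    #    contains the boxes of all its descendants.
    walker = branch.parent
    while walker is not None:
        walker.box = walker.left.box.union(walker.right.box)
        walker = walker.parent
    return root
```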
S230, acquiring current position information, and determining reference position information based on the current position information.
S240, determining target boundary information associated with the reference position information from a pre-built axis alignment bounding rectangular tree.
S250, determining the road boundary width according to the target boundary information and the reference position information.
On the basis of the above embodiment, this embodiment adds the operation of constructing the axis alignment bounding rectangle tree: the target boundary points of the road boundary are acquired, road boundary line segments are generated based on adjacent target boundary points, the axis alignment bounding rectangle corresponding to each road boundary line segment is constructed, and the axis alignment bounding rectangle tree is built from these rectangles. This keeps the segment-based range query time within a bounded scale, reduces the time needed to determine the target boundary information during road boundary detection, and improves the real-time performance of road boundary detection.
Example Three
Fig. 3a is a flowchart of a road boundary detection method according to a third embodiment of the present invention. This embodiment provides a preferred embodiment on the basis of the above-described embodiments. As shown in fig. 3a, the method comprises:
s310, initializing a road boundary.
In the present embodiment, the high-precision map boundary data is preprocessed in the road boundary initialization stage. Optionally, the road boundary initialization stage includes three steps: encoding the road boundary, constructing the axis alignment bounding rectangles, and building a binary tree from the axis alignment bounding rectangles. Specifically:
(1) The road boundaries are encoded. After the discretized road boundaries are obtained from the map, adjacent discrete boundary points in all the boundaries are encoded to form a line segment set. Fig. 3b is a schematic view of a discretized road boundary according to a third embodiment of the present invention. Fig. 3c is a schematic diagram of encoding a road boundary segment according to a third embodiment of the present invention. As shown in fig. 3b, the discretized road boundary consists of discrete points. As shown in fig. 3c, adjacent discrete points are connected, and the encoded boundary line segments as shown in fig. 3c can be obtained based on the encoding of the discrete boundary points.
(2) Constructing the axis alignment bounding rectangles. An AABB is constructed from the two points (max_x, max_y) and (min_x, min_y) of each encoded line segment. FIG. 3d is a schematic view of an axis alignment bounding rectangle provided by the third embodiment of the present invention. As shown in FIG. 3d, the rectangle having the road boundary line segment as its diagonal serves as the axis alignment bounding rectangle corresponding to that line segment.
(3) Constructing the AABBs into a binary tree. In order to keep the data processing scale under control, the AABBs of the line segments can be organized into a binary tree, so that the segment-based range query time is kept within a scale of log(n). FIG. 3e is a schematic view of an axis alignment bounding rectangle tree according to the third embodiment of the present invention. As shown in FIG. 3e, each node in the figure represents a road boundary line segment.
S320, a road boundary query stage.
In this embodiment, the unmanned vehicle obtains the surrounding road boundary widths in real time according to the vehicle model through the road boundary query stage. Optionally, the road boundary query stage includes three steps: determining the target boundary line segments, classifying the target boundary line segments, and calculating the boundary widths. Specifically:
(1) Determining the target boundary line segments. Specifically, the vehicle is abstracted into a line segment l_ego; taking the center point of this line segment as the center, all AABBs (each representing a road boundary line segment) within a certain radius are queried in the AABB tree. FIG. 3f is a schematic diagram of determining the target boundary line segments according to the third embodiment of the present invention. As shown in FIG. 3f, the arrow on the left represents the line segment into which the vehicle is abstracted, and the circle centered on the arrow represents the query range. The left diagram of FIG. 3f shows all road boundary line segments, and the right diagram of FIG. 3f shows the target boundary line segments obtained by the query.
(2) Classifying the target boundary line segments. According to the vehicle line segment coordinates and the target boundary line segment coordinates, the target boundary line segments are divided into two classes: left boundaries lines_left and right boundaries lines_right.
As shown in FIG. 3f, lines_left = {lines_1_7, lines_1_8, lines_1_9}, i.e. the left boundary includes road boundary line segments 1_7, 1_8 and 1_9; lines_right = {lines_2_6, lines_2_7, lines_2_8, lines_2_9, lines_2_10, lines_2_11}, i.e. the right boundary includes road boundary line segments 2_6, 2_7, 2_8, 2_9, 2_10 and 2_11.
(3) Calculating the boundary width. The shortest distances from the vehicle line segment l_ego to the left and right road boundary sets are calculated separately. Alternatively, the segment-to-segment distance may be defined as:

f(l_1, l_2) = min( D(l_1, l_2^s), D(l_1, l_2^e), D(l_2, l_1^s), D(l_2, l_1^e) )

where D denotes the distance from a point to a line segment, l^s denotes the start point of a line segment, and l^e denotes its end point.

Based on the above, the left road boundary width dist_left is obtained as dist_left = f(l_ego, lines_left), and the right road boundary width dist_right as dist_right = f(l_ego, lines_right).
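The point-to-segment distance D and the segment-to-segment distance f(l_1, l_2) defined above can be implemented directly; this sketch follows the patent's formula, with all names chosen for illustration.

```python
import math
from typing import Tuple

Point = Tuple[float, float]
Segment = Tuple[Point, Point]


def point_to_segment(p: Point, seg: Segment) -> float:
    """D(seg, p): Euclidean distance from point p to line segment seg."""
    (x1, y1), (x2, y2) = seg
    px, py = p
    dx, dy = x2 - x1, y2 - y1
    if dx == 0.0 and dy == 0.0:              # degenerate segment
        return math.hypot(px - x1, py - y1)
    t = ((px - x1) * dx + (py - y1) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))                # clamp projection onto the segment
    return math.hypot(px - (x1 + t * dx), py - (y1 + t * dy))


def segment_to_segment(l1: Segment, l2: Segment) -> float:
    """f(l1, l2) as defined above: the minimum of the four
    endpoint-to-segment distances."""
    return min(point_to_segment(l2[0], l1), point_to_segment(l2[1], l1),
               point_to_segment(l1[0], l2), point_to_segment(l1[1], l2))


# dist_left = min over lines_left of f(l_ego, line); likewise for the right.
l_ego = ((0.0, 0.0), (0.0, 4.5))
left_line = ((-3.0, -2.0), (-3.0, 6.0))
print(segment_to_segment(l_ego, left_line))  # 3.0
```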
According to the embodiment of the invention, the road boundary is discretized into line segments, and an axis alignment bounding rectangle tree is constructed from them, so that the data processing scale is kept within a certain bound and the segment-based range query time is reduced; the vehicle is abstracted into a line segment, and the road boundary width is calculated through a segment-to-segment mathematical model, making the calculation of the road boundary width more accurate.
Example Four
Fig. 4 is a schematic structural diagram of a road boundary detecting device according to a fourth embodiment of the present invention. The road boundary detection device may be implemented in software and/or hardware, for example, the road boundary detection device may be configured in an unmanned vehicle. As shown in fig. 4, the apparatus includes a reference position determination module 410, a target boundary determination module 420, and a boundary width determination module 430, wherein:
a reference position determining module 410, configured to obtain current position information, and determine reference position information of road boundary detection based on the current position information;
a target boundary determination module 420 for determining target boundary information associated with the reference position information from a pre-built axis alignment bounding rectangular tree;
the boundary width determining module 430 is configured to determine a road boundary width according to the target boundary information and the reference position information.
According to the embodiment of the invention, the reference position determining module acquires the current position information and determines the reference position information of road boundary detection based on it; the target boundary determining module determines the target boundary information associated with the reference position information from the pre-built axis alignment bounding rectangle tree, which simplifies the search process for determining the target boundary information; and the boundary width determining module determines the road boundary width according to the target boundary information and the reference position information, which reduces the amount of calculation of the road boundary width and improves its accuracy, so that the accuracy of the road boundary detection result is ensured while the real-time performance of road boundary detection is improved.
Optionally, based on the above scheme, the target boundary determining module 420 is specifically configured to:
determining a reference matching range according to the reference position information;
acquiring rectangular coverage areas corresponding to all the axis alignment bounding rectangles in the axis alignment bounding rectangle tree, and taking the axis alignment bounding rectangle corresponding to the rectangular coverage area intersected with the reference matching range as a target axis alignment bounding rectangle;
and taking the road boundary line segment corresponding to the target axis alignment bounding rectangle as target boundary information.
Optionally, based on the above scheme, the boundary width determining module 430 includes:
a boundary information dividing unit that determines left boundary information and right boundary information in the target boundary information based on boundary position information and reference position information of the target boundary information;
and the boundary width calculation unit is used for determining the left road side boundary width according to the left boundary information and the reference position information and determining the right road side boundary width according to the right boundary information and the reference position information.
Optionally, on the basis of the above scheme, the boundary width calculating unit is specifically configured to:
calculating left road distance between each left boundary information and the reference position information, and taking the shortest left road distance as left road boundary width;
and calculating the right road distance between each right boundary information and the reference position information, and taking the shortest right road distance as the right road boundary width.
Optionally, on the basis of the above scheme, the apparatus further comprises an axis alignment bounding rectangle tree construction module, configured to:
acquiring target boundary points of a road boundary, and generating a road boundary line segment based on adjacent target boundary points;
and constructing an axis alignment bounding rectangle corresponding to each road boundary line segment, and constructing an axis alignment bounding rectangle tree based on the axis alignment bounding rectangles.
Optionally, based on the above scheme, the axis alignment bounding rectangle tree construction module is specifically configured to:
creating a target leaf node corresponding to each axis alignment bounding rectangle and determining a target level of the target leaf node;
creating a target branch node based on the target level, adding the target leaf node to the target branch node, adding an associated node associated with the target branch node to the target branch node, and deleting the associated node;
and connecting the target branch node with the existing node until all the axis alignment bounding rectangles are traversed, and obtaining the axis alignment bounding rectangle tree.
Optionally, based on the above scheme, the axis alignment bounding rectangle tree construction module is specifically configured to:
and acquiring an original boundary point of the road boundary, and performing data preprocessing on the original boundary point to obtain a target boundary point.
The road boundary detection device provided by the embodiment of the invention can execute the road boundary detection method provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of the execution method.
Example Five
FIG. 5 is a schematic structural diagram of an unmanned vehicle according to a fifth embodiment of the present invention. FIG. 5 shows a block diagram of an exemplary unmanned vehicle 512 suitable for use in implementing embodiments of the present invention. The unmanned vehicle 512 shown in FIG. 5 is merely an example, and should not be taken as limiting the functionality and scope of use of embodiments of the present invention.
As shown in FIG. 5, the unmanned vehicle 512 takes the form of a general purpose computing device. Components of the unmanned vehicle 512 may include, but are not limited to: one or more processors 516, a system memory 528, and a bus 518 that connects the various system components (including the system memory 528 and the processor 516). Optionally, the unmanned vehicle 512 also includes a sensor such as a GPS for determining the location information of the unmanned vehicle.
Bus 518 represents one or more of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the Enhanced ISA bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
The unmanned vehicle 512 typically includes a variety of computer system readable media. Such media can be any available media that is accessible by the unmanned vehicle 512 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 528 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM) 530 and/or cache memory 532. The unmanned vehicle 512 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage 534 may be used to read from or write to a non-removable, non-volatile magnetic medium (not shown in FIG. 5, commonly referred to as a "hard disk drive"). Although not shown in FIG. 5, a magnetic disk drive for reading from and writing to a removable non-volatile magnetic disk (e.g., a "floppy disk"), and an optical disk drive for reading from or writing to a removable non-volatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In such cases, each drive may be coupled to bus 518 through one or more data media interfaces. Memory 528 may include at least one program product having a set (e.g., at least one) of program modules configured to carry out the functions of embodiments of the invention.
A program/utility 540 having a set (at least one) of program modules 542 may be stored in, for example, memory 528, such program modules 542 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment. Program modules 542 generally perform the functions and/or methods in the described embodiments of the invention.
The unmanned vehicle 512 may also communicate with one or more external devices 514 (e.g., a keyboard, a pointing device, a display 524, etc.), with one or more devices that enable a user to interact with the unmanned vehicle 512, and/or with any devices (e.g., a network card, a modem, etc.) that enable the unmanned vehicle 512 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 522. Also, the unmanned vehicle 512 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the Internet) through the network adapter 520. As shown, the network adapter 520 communicates with the other modules of the unmanned vehicle 512 via the bus 518. It should be appreciated that although not shown, other hardware and/or software modules may be used in conjunction with the unmanned vehicle 512, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
The processor 516 executes various functional applications and data processing by running programs stored in the system memory 528, for example, to implement the road boundary detection method provided by the embodiment of the present invention, the method includes:
acquiring current position information, and determining reference position information of road boundary detection based on the current position information;
determining target boundary information associated with the reference position information from a pre-built axis alignment bounding rectangular tree;
and determining the road boundary width according to the target boundary information and the reference position information.
Of course, those skilled in the art will understand that the processor may also implement the technical solution of the road boundary detection method provided in any embodiment of the present invention.
Example Six
The sixth embodiment of the present invention also provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the road boundary detection method provided by the embodiment of the present invention, the method comprising:
acquiring current position information, and determining reference position information of road boundary detection based on the current position information;
determining target boundary information associated with the reference position information from a pre-built axis alignment bounding rectangular tree;
and determining the road boundary width according to the target boundary information and the reference position information.
Of course, the computer-readable storage medium provided by the embodiments of the present invention, on which the computer program stored, is not limited to the method operations described above, but may also perform the related operations of the road boundary detection method provided by any of the embodiments of the present invention.
The computer storage media of embodiments of the invention may take the form of any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
Note that the above is only a preferred embodiment of the present invention and the technical principle applied. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, while the invention has been described in connection with the above embodiments, the invention is not limited to the embodiments, but may be embodied in many other equivalent forms without departing from the spirit or scope of the invention, which is set forth in the following claims.

Claims (8)

1. A road boundary detection method, characterized by comprising:
acquiring current position information, and determining reference position information of road boundary detection based on the current position information;
determining target boundary information associated with the reference position information from a pre-built axis alignment bounding rectangular tree;
determining a road boundary width according to the target boundary information and the reference position information;
wherein the construction of the axis alignment bounding rectangle tree comprises:
acquiring a target boundary point of a road boundary, and generating a road boundary line segment based on the adjacent target boundary points;
constructing an axis alignment bounding rectangle corresponding to each road boundary line segment, and constructing the axis alignment bounding rectangle tree based on the axis alignment bounding rectangles;
the determining target boundary information associated with the reference position information from a pre-built axis alignment bounding rectangular tree includes:
determining a reference matching range according to the reference position information;
acquiring rectangular coverage areas corresponding to all the axis alignment bounding rectangles in the axis alignment bounding rectangle tree, and taking the axis alignment bounding rectangle corresponding to the rectangular coverage area intersected with the reference matching range as a target axis alignment bounding rectangle;
and taking the road boundary line segment corresponding to the target axis alignment bounding rectangle as the target boundary information.
2. The method of claim 1, wherein the determining a road boundary width from the target boundary information and the reference position information comprises:
determining left boundary information and right boundary information in the target boundary information according to the boundary position information of the target boundary information and the reference position information;
and determining the width of the left road side boundary according to the left boundary information and the reference position information, and determining the width of the right road side boundary according to the right boundary information and the reference position information.
3. The method of claim 2, wherein the determining a left road side boundary width according to the left boundary information and the reference position information, and the determining a right road side boundary width according to the right boundary information and the reference position information, comprise:
calculating a left road distance between each piece of left boundary information and the reference position information, and taking the shortest left road distance as the left road side boundary width;
and calculating a right road distance between each piece of right boundary information and the reference position information, and taking the shortest right road distance as the right road side boundary width.
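
Claims 2 and 3 amount to a point-to-segment distance computation followed by a minimum over each side of the vehicle. The Python sketch below is one hedged reading of that step: splitting the target boundary segments into left and right by the sign of a cross product against the vehicle heading is an assumption (the claims do not state how the split is made), and point_segment_distance and boundary_widths are illustrative names only.

    import math
    from typing import Iterable, List, Tuple

    Point = Tuple[float, float]
    Segment = Tuple[Point, Point]

    def point_segment_distance(p: Point, seg: Segment) -> float:
        """Euclidean distance from point p to the closest point on line segment seg."""
        (x1, y1), (x2, y2) = seg
        px, py = p
        dx, dy = x2 - x1, y2 - y1
        if dx == 0 and dy == 0:
            return math.hypot(px - x1, py - y1)
        t = ((px - x1) * dx + (py - y1) * dy) / (dx * dx + dy * dy)
        t = max(0.0, min(1.0, t))  # clamp the projection onto the segment
        return math.hypot(px - (x1 + t * dx), py - (y1 + t * dy))

    def is_left_of(ref: Point, heading: Tuple[float, float], seg: Segment) -> bool:
        # Assumed rule: a segment whose midpoint lies left of the heading axis
        # contributes to the left boundary; otherwise to the right boundary.
        (x1, y1), (x2, y2) = seg
        mx, my = (x1 + x2) / 2.0, (y1 + y2) / 2.0
        return heading[0] * (my - ref[1]) - heading[1] * (mx - ref[0]) > 0

    def boundary_widths(ref: Point, heading: Tuple[float, float],
                        segments: Iterable[Segment]) -> Tuple[float, float]:
        """Shortest distance to the left-side and right-side boundary segments."""
        segs: List[Segment] = list(segments)
        left = [point_segment_distance(ref, s) for s in segs if is_left_of(ref, heading, s)]
        right = [point_segment_distance(ref, s) for s in segs if not is_left_of(ref, heading, s)]
        return (min(left) if left else float("inf"),
                min(right) if right else float("inf"))

The clamp on t keeps the distance measured to the segment itself rather than to its infinite supporting line, which matters near curved boundaries built from short segments.
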
4. The method of claim 1, wherein the constructing an axis-aligned bounding rectangle corresponding to each road boundary line segment and constructing the axis-aligned bounding rectangle tree based on the axis-aligned bounding rectangles comprises:
creating a target leaf node corresponding to each axis-aligned bounding rectangle, and determining a target level of the target leaf node;
creating a target branch node based on the target level, adding the target leaf node to the target branch node, adding an associated node associated with the target branch node to the target branch node, and deleting the associated node;
and connecting the target branch node with the existing nodes until all the axis-aligned bounding rectangles are traversed, to obtain the axis-aligned bounding rectangle tree.
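
Claim 4 assembles a hierarchy of leaf and branch nodes over the leaf rectangles. The exact level rule and node-merging policy are not spelled out in the claim, so the sketch below takes one common bottom-up approach: group a fixed number of children per branch and grow each branch's rectangle to enclose its children. Node, union_rect, build_tree and the fanout of 4 are illustrative assumptions, not the patented procedure.

    from dataclasses import dataclass, field
    from typing import List, Optional, Tuple

    Rect = Tuple[float, float, float, float]  # xmin, ymin, xmax, ymax

    @dataclass
    class Node:
        rect: Rect
        children: List["Node"] = field(default_factory=list)
        segment_index: Optional[int] = None  # set only on leaf nodes

    def union_rect(rects: List[Rect]) -> Rect:
        return (min(r[0] for r in rects), min(r[1] for r in rects),
                max(r[2] for r in rects), max(r[3] for r in rects))

    def build_tree(leaf_rects: List[Rect], fanout: int = 4) -> Node:
        """Group leaves into branch nodes level by level until a single root remains."""
        if not leaf_rects:
            raise ValueError("at least one leaf rectangle is required")
        level = [Node(rect=r, segment_index=i) for i, r in enumerate(leaf_rects)]
        while len(level) > 1:
            next_level = []
            for i in range(0, len(level), fanout):
                group = level[i:i + fanout]
                next_level.append(Node(rect=union_rect([n.rect for n in group]), children=group))
            level = next_level
        return level[0]

A range query on such a tree descends only into branches whose rectangle intersects the reference matching range, which is what keeps the lookup cheap compared with testing every segment.
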
5. The method of claim 1, wherein the acquiring target boundary points of a road boundary comprises:
acquiring original boundary points of the road boundary, and performing data preprocessing on the original boundary points to obtain the target boundary points.
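
Claim 5 leaves the data preprocessing unspecified. One plausible, purely illustrative reading is removing duplicate and near-duplicate original boundary points so that no degenerate (zero-length) road boundary line segments are produced; preprocess_boundary_points and the min_spacing threshold below are assumptions, not the patented preprocessing.

    import math
    from typing import List, Tuple

    Point = Tuple[float, float]

    def preprocess_boundary_points(raw: List[Point], min_spacing: float = 0.05) -> List[Point]:
        # Assumed preprocessing: drop consecutive points closer than min_spacing (metres).
        cleaned: List[Point] = []
        for p in raw:
            if not cleaned or math.hypot(p[0] - cleaned[-1][0], p[1] - cleaned[-1][1]) >= min_spacing:
                cleaned.append(p)
        return cleaned
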
6. A road boundary detection apparatus, comprising:
the reference position determining module is used for acquiring current position information and determining reference position information of road boundary detection based on the current position information;
a target boundary determining module for determining target boundary information associated with the reference position information from a pre-built axis-aligned bounding rectangle tree;
the boundary width determining module is used for determining the road boundary width according to the target boundary information and the reference position information;
wherein the apparatus further comprises an axis-aligned bounding rectangle tree building module for:
acquiring target boundary points of a road boundary, and generating road boundary line segments based on adjacent target boundary points;
constructing an axis-aligned bounding rectangle corresponding to each road boundary line segment, and constructing the axis-aligned bounding rectangle tree based on the axis-aligned bounding rectangles;
wherein the target boundary determining module is specifically configured for:
determining a reference matching range according to the reference position information;
acquiring rectangular coverage areas corresponding to all the axis-aligned bounding rectangles in the axis-aligned bounding rectangle tree, and taking the axis-aligned bounding rectangle whose rectangular coverage area intersects the reference matching range as a target axis-aligned bounding rectangle;
and taking the road boundary line segment corresponding to the target axis-aligned bounding rectangle as the target boundary information.
7. An unmanned vehicle, the unmanned vehicle comprising:
one or more processors;
a storage means for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the road boundary detection method of any one of claims 1-5.
8. A computer-readable storage medium, on which a computer program is stored, characterized in that the program, when executed by a processor, implements the road boundary detection method according to any one of claims 1-5.
CN202010538071.2A 2020-06-12 2020-06-12 Road boundary detection method, device, unmanned vehicle and storage medium Active CN113761990B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010538071.2A CN113761990B (en) 2020-06-12 2020-06-12 Road boundary detection method, device, unmanned vehicle and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010538071.2A CN113761990B (en) 2020-06-12 2020-06-12 Road boundary detection method, device, unmanned vehicle and storage medium

Publications (2)

Publication Number Publication Date
CN113761990A CN113761990A (en) 2021-12-07
CN113761990B (en) 2024-04-09

Family

ID=78785381

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010538071.2A Active CN113761990B (en) 2020-06-12 2020-06-12 Road boundary detection method, device, unmanned vehicle and storage medium

Country Status (1)

Country Link
CN (1) CN113761990B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011243161A (en) * 2010-05-21 2011-12-01 Denso Corp Lane boundary detection apparatus and lane boundary detection program
CN111095291A (en) * 2018-02-27 2020-05-01 辉达公司 Real-time detection of lanes and boundaries by autonomous vehicles
CN111104410A (en) * 2019-12-25 2020-05-05 北京经纬恒润科技有限公司 Local road information extraction method and device

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4162618B2 (en) * 2004-03-12 2008-10-08 株式会社豊田中央研究所 Lane boundary judgment device
JP4622001B2 (en) * 2008-05-27 2011-02-02 トヨタ自動車株式会社 Road lane marking detection apparatus and road lane marking detection method
CN105704195B (en) * 2014-11-28 2019-12-10 国际商业机器公司 method and equipment for determining road network partition boundary line

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011243161A (en) * 2010-05-21 2011-12-01 Denso Corp Lane boundary detection apparatus and lane boundary detection program
CN111095291A (en) * 2018-02-27 2020-05-01 辉达公司 Real-time detection of lanes and boundaries by autonomous vehicles
CN111104410A (en) * 2019-12-25 2020-05-05 北京经纬恒润科技有限公司 Local road information extraction method and device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Real time road edges detection and road signs recognition; Jianmin D. et al.; 2015 International Conference on Control, Automation and Information Sciences (ICCAIS); full text *
Road boundary recognition algorithm in autonomous vehicle navigation; Xu Jie, Li Xiaohu, Wang Rongben, Shi Pengfei; Journal of Image and Graphics (Issue 06); full text *

Also Published As

Publication number Publication date
CN113761990A (en) 2021-12-07

Similar Documents

Publication Publication Date Title
JP7082151B2 (en) Map trajectory matching data quality determination method, equipment, server and medium
CN110260870B (en) Map matching method, device, equipment and medium based on hidden Markov model
US11506769B2 (en) Method and device for detecting precision of internal parameter of laser radar
US10627520B2 (en) Method and apparatus for constructing reflectance map
EP3617997A1 (en) Method, apparatus, device, and storage medium for calibrating posture of moving obstacle
CN110427444B (en) Navigation guide point mining method, device, equipment and storage medium
CN107883974B (en) Navigation path planning method, navigation server and computer readable medium
CN110006439B (en) Map track data matching method, map track data matching device, server and storage medium
EP3794312B1 (en) Indoor location-based service
CN110542425B (en) Navigation path selection method, navigation device, computer equipment and readable medium
CN113010793A (en) Method, device, equipment, storage medium and program product for map data processing
CN109558854B (en) Obstacle sensing method and device, electronic equipment and storage medium
WO2018058888A1 (en) Street view image recognition method and apparatus, server and storage medium
CN110553658B (en) Navigation path recommendation method, navigation server, computer device and readable medium
CN110555432B (en) Method, device, equipment and medium for processing interest points
CN112798004A (en) Vehicle positioning method, device, equipment and storage medium
CN110647675A (en) Method and device for recognition of stop point and training of prediction model and storage medium
CN111354217A (en) Parking route determining method, device, equipment and medium
CN115510175A (en) Method and device for converting geographical coordinates of dwg data, computer equipment and medium
CN110555352A (en) interest point identification method, device, server and storage medium
CN114116946A (en) Data processing method and device, electronic equipment and storage medium
CN113722342A (en) High-precision map element change detection method, device and equipment and automatic driving vehicle
CN113761990B (en) Road boundary detection method, device, unmanned vehicle and storage medium
CN114578401B (en) Method and device for generating lane track points, electronic equipment and storage medium
CN112987707A (en) Automatic driving control method and device for vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant