CN117470269A - Method, device, equipment and storage medium for planning path of automatic driving vehicle - Google Patents


Info

Publication number
CN117470269A
CN117470269A (application CN202311601456.9A)
Authority
CN
China
Prior art keywords
point
projection
coordinates
lane line
automatic driving
Prior art date
Legal status
Pending
Application number
CN202311601456.9A
Other languages
Chinese (zh)
Inventor
赵廷栋
Current Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN202311601456.9A
Publication of CN117470269A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00, specially adapted for navigation in a road network
    • G01C21/34: Route searching; Route guidance
    • G01C21/3453: Special cost functions, i.e. other than distance or default speed limit of road segments
    • G01C21/3492: Special cost functions, i.e. other than distance or default speed limit of road segments, employing speed data or traffic data, e.g. real-time or historical


Abstract

The disclosure provides a path planning method, apparatus, device, and storage medium for an autonomous vehicle, and relates to artificial intelligence fields such as autonomous driving and intelligent transportation. The method comprises the following steps: acquiring coordinate information of an autonomous vehicle and attribute information of the lane line where the vehicle is currently located; in response to determining, based on the attribute information, that the current lane is a bidirectional single lane, determining the positional relationship between the projection points corresponding to the autonomous vehicle and the current lane line based on the coordinate information; determining a target projection point from the projection points based on the positional relationship, and determining a target lane line in the direction corresponding to the target projection point; and generating a target planned path based on the target lane line. The path planning method provided by the disclosure achieves path planning for the autonomous vehicle in a bidirectional single-lane scenario.

Description

Method, device, equipment and storage medium for planning path of automatic driving vehicle
Technical field
The disclosure relates to the technical field of artificial intelligence such as automatic driving and intelligent traffic, and in particular relates to a path planning method, device and equipment for an automatic driving vehicle and a storage medium.
Background
An autonomous vehicle, also called a driverless car, self-driving car, or wheeled mobile robot, is an intelligent vehicle that achieves driverless operation through a computer system. Relying on artificial intelligence, visual computing, radar, monitoring devices, and a global positioning system working in concert, the onboard computer can operate the motor vehicle automatically and safely without any active human operation. A bidirectional single lane is a single lane that allows vehicles to pass in both directions. When an autonomous vehicle drives on such bidirectional single-lane road segments, path planning is prone to failure, which can cause problems in vehicle operation.
Disclosure of Invention
The disclosure provides a path planning method, a device, equipment and a storage medium for an automatic driving vehicle.
According to a first aspect of the present disclosure, there is provided a path planning method of an autonomous vehicle, comprising: acquiring coordinate information of an automatic driving vehicle and attribute information of a lane line where the automatic driving vehicle is currently located; determining that the current lane line is a bidirectional single lane based on the attribute information, and determining the position relationship between the projection point corresponding to the automatic driving vehicle and the current lane line based on the coordinate information; determining a target projection point from the projection points based on the position relation, and determining a target lane line in a direction corresponding to the target projection point; based on the target lane lines, a target planned path is generated.
According to a second aspect of the present disclosure, there is provided a path planning apparatus of an autonomous vehicle, comprising: the acquisition module is configured to acquire coordinate information of the automatic driving vehicle and attribute information of a lane line where the automatic driving vehicle is currently located; the first determining module is configured to determine the position relation between the projection point corresponding to the automatic driving vehicle and the lane line where the current lane line is located based on the coordinate information in response to determining that the lane line where the current lane line is located is a bidirectional single lane based on the attribute information; a second determining module configured to determine a target projection point from among the projection points based on the positional relationship, and determine a target lane line in a direction corresponding to the target projection point; the generation module is configured to generate a target planning path based on the target lane line.
According to a third aspect of the present disclosure, there is provided an electronic device comprising: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method as described in any one of the implementations of the first aspect.
According to a fourth aspect of the present disclosure, there is provided a non-transitory computer readable storage medium storing computer instructions for causing a computer to perform a method as described in any one of the implementations of the first aspect.
According to a fifth aspect of the present disclosure, there is provided a computer program product comprising a computer program which, when executed by a processor, implements a method as described in any of the implementations of the first aspect.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the disclosure, nor is it intended to be used to limit the scope of the disclosure. Other features of the present disclosure will become apparent from the following specification.
Drawings
The drawings are for a better understanding of the present solution and are not to be construed as limiting the present disclosure. Wherein:
FIG. 1 is an exemplary system architecture diagram to which the present disclosure may be applied;
FIG. 2 is a flow chart of one embodiment of a path planning method of an autonomous vehicle according to the present disclosure;
FIG. 3 is a flow chart of another embodiment of a path planning method of an autonomous vehicle according to the present disclosure;
FIG. 4 is a flow chart of yet another embodiment of a path planning method of an autonomous vehicle according to the present disclosure;
FIG. 5 is a schematic structural view of one embodiment of a path planning apparatus of an autonomous vehicle according to the present disclosure;
fig. 6 is a block diagram of an electronic device for implementing a path planning method of an autonomous vehicle according to an embodiment of the present disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described below in conjunction with the accompanying drawings, which include various details of the embodiments of the present disclosure to facilitate understanding, and should be considered as merely exemplary. Accordingly, one of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present disclosure. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
It should be noted that, without conflict, the embodiments of the present disclosure and features of the embodiments may be combined with each other. The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 illustrates an exemplary system architecture 100 to which embodiments of a path planning method of an autonomous vehicle or a path planning apparatus of an autonomous vehicle of the present disclosure may be applied.
As shown in fig. 1, the system architecture 100 may include a server 101, a network 102, and an autonomous vehicle 103. The network 102 serves as a medium to provide a communication link between the server 101 and the autonomous vehicle 103. Network 102 may include various connection types such as wired, wireless communication links, or fiber optic cables, among others.
The server 101 may provide various services. For example, the server 101 may analyze and process coordinate information acquired from the autonomous vehicle 103 and attribute information of a lane line where it is currently located, and generate a processing result (e.g., a target planned path).
The autonomous vehicle 103 may interact with the server 101 through the network 102 to receive or send messages or the like. For example, the autonomous vehicle 103 may acquire current coordinate information and attribute information of a lane line where it is currently located, and transmit the acquired information to the server 101.
It should be noted that, the path planning method for the autonomous vehicle provided in the embodiments of the present disclosure is generally executed by the server 101, and accordingly, the path planning device for the autonomous vehicle is generally disposed in the server 101.
It should be understood that the number of servers, networks, and autonomous vehicles in fig. 1 is merely illustrative. There may be any number of servers, networks, and autonomous vehicles, as desired for implementation.
With continued reference to fig. 2, a flow 200 of one embodiment of a path planning method for an autonomous vehicle according to the present disclosure is shown. The path planning method of the automatic driving vehicle comprises the following steps:
Step 201, acquiring coordinate information of an automatic driving vehicle and attribute information of a lane line where the automatic driving vehicle is currently located.
In this embodiment, the execution subject of the path planning method (e.g., the server 101 shown in fig. 1) acquires the coordinate information of the autonomous vehicle and the attribute information of the lane line where the vehicle is currently located. An autonomous vehicle, also called a driverless car, self-driving car, or wheeled mobile robot, is an intelligent vehicle that achieves driverless operation through a computer system, relying on artificial intelligence, visual computing, radar, monitoring devices, and a global positioning system working in concert, so that the computer can operate the motor vehicle automatically and safely without any active human operation. The coordinate information acquired by the execution subject may be the real-time position coordinates of the vehicle, or the start-point and end-point coordinates of its route. The execution subject also acquires attribute information of the lane line where the vehicle is currently located; this attribute information may include the line type (dashed or solid), the color, and the lane-definition attributes of the lane line (such as whether it is the left or right lane line of the main lane), and may also include information indicating whether the current lane is a bidirectional single lane. After acquiring the attribute information, the execution subject determines, based on it, whether the lane where the autonomous vehicle is currently located is a bidirectional single lane.
And step 202, determining the position relation between the projection point corresponding to the automatic driving vehicle and the lane line where the current position is located based on the coordinate information in response to determining that the lane line where the current position is located is a bidirectional single lane based on the attribute information.
In this embodiment, if the execution subject determines, based on the attribute information, that the lane where the vehicle is currently located is a bidirectional single lane (a single lane that allows vehicles to pass in both directions), it determines the positional relationship between the projection points corresponding to the autonomous vehicle and the current lane line based on the coordinate information. Here, the coordinate information generally refers to coordinates in the UTM coordinate system, which may be acquired directly from the sensors of the autonomous vehicle. Note that the UTM (Universal Transverse Mercator) coordinate system uses the UTM projection to map segments of the ellipsoid into a planar rectangular coordinate system. Such coordinate systems, and the projections on which they are based, are widely used in topographic mapping, as reference grids for satellite imagery and natural resource databases, and in other applications requiring accurate positioning.
Specifically, the execution subject projects the coordinates in the UTM coordinate system into the curvilinear SL coordinate system; that is, it projects the real-time UTM coordinates of the autonomous vehicle into the SL coordinate system to obtain the projection coordinates of the projection point corresponding to each real-time track point of the vehicle, these projection coordinates being SL coordinates. The curvilinear SL coordinate system, also called a Frenet coordinate system, takes the road center line as its reference: S represents the direction along the road center line, and L represents the direction perpendicular to it. In other words, the SL coordinates of a point give its longitudinal distance S and lateral distance L relative to the center line. The execution subject can therefore determine, from the L value of the SL coordinates, the positional relationship between the projection point of the vehicle's current position and the current lane line, i.e., whether the vehicle is currently on the left or the right side of the lane line. For example, if L is less than 0, the autonomous vehicle is determined to be on the right side of the lane line; if L is greater than 0, it is determined to be on the left side.
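The UTM-to-SL projection and the sign test on L described above can be sketched as follows. This is a minimal illustration, assuming the lane center line is represented as a polyline of UTM points; the function name `utm_to_sl` and the left-positive sign convention for L are assumptions made for the sketch, not details taken from the disclosure.

```python
import math

def utm_to_sl(point, centerline):
    """Project a UTM point onto a polyline center line and return (s, l).

    s is the arc length along the center line to the foot of the projection;
    l is the signed lateral offset, positive when the point lies to the left
    of the travel direction and negative to the right, obtained from the
    sign of the 2D cross product. Assumes the polyline has >= 2 points.
    """
    px, py = point
    best = None  # (distance, s, l) of the closest segment so far
    s_base = 0.0  # arc length accumulated up to the current segment
    for (ax, ay), (bx, by) in zip(centerline, centerline[1:]):
        dx, dy = bx - ax, by - ay
        seg_len = math.hypot(dx, dy)
        # Parameter of the foot of the perpendicular, clamped to the segment.
        t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len ** 2))
        fx, fy = ax + t * dx, ay + t * dy
        dist = math.hypot(px - fx, py - fy)
        # Cross product sign: > 0 means the point is left of the segment.
        cross = dx * (py - ay) - dy * (px - ax)
        l = dist if cross > 0 else -dist
        if best is None or dist < best[0]:
            best = (dist, s_base + t * seg_len, l)
        s_base += seg_len
    _, s, l = best
    return s, l
```

With L computed this way, the left/right test stated in the text reduces to checking the sign of the returned `l` value.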
Step 203, determining a target projection point from the projection points based on the position relationship, and determining a target lane line corresponding to the target projection point.
In this embodiment, the execution subject determines the target projection point from among the projection points based on the positional relationship between the projection points corresponding to the autonomous vehicle and the current lane line, and takes the lane line corresponding to the target projection point as the target lane line. That is, if the projection points corresponding to the autonomous vehicle lie on the right side of the lane line, then, combined with the keep-right traffic rule, the projection points corresponding to the vehicle's coordinates (for example, the current coordinate, the start-point coordinate, and the end-point coordinate) can be directly taken as the target projection points, and the lane line consistent with the direction of these projection points, i.e., the lane line where the vehicle is currently located, is taken as the target lane line. If the projection points corresponding to the autonomous vehicle lie on the left side of the lane line, then, again under the keep-right rule, the execution subject first obtains the information of the lane line opposite to the current one, i.e., the reverse lane line of the current lane, takes that reverse lane line as the target lane line, and then, among the projection points corresponding to the vehicle's coordinates (such as the current, start-point, and end-point coordinates), determines the projection point whose direction is consistent with the reverse lane line as the target projection point.
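The target-lane decision in this step can be illustrated with a small sketch, assuming the signed L values from step 202 are available (L less than 0 meaning the point lies to the right of the current lane line, as in the example rule above). The function name and the lane-id parameters are hypothetical, introduced only for illustration.

```python
def select_target_lane(l_values, current_lane_id, reverse_lane_id):
    """Pick the target lane line under a keep-right traffic rule.

    l_values: signed lateral offsets (the L of the SL coordinates) of the
    projection points for the current, start-point, and end-point
    coordinates; a negative value means the point is right of the current
    lane line. Returns the id of the lane line to plan along.
    """
    if all(l < 0 for l in l_values):
        # All projection points are right of the lane line: the current
        # lane line already matches the direction of travel.
        return current_lane_id
    # Otherwise the vehicle is left of the lane line; under keep-right
    # traffic, the opposite-direction (reverse) lane line is the target.
    return reverse_lane_id
```

A planner would then pass the returned lane line, together with its direction information, to the path generation of step 204.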
Step 204, generating a target planned path based on the target lane line.
In this embodiment, the executing body performs path planning of the autonomous vehicle based on the target lane line and the direction information of the target lane line, so as to generate a target planned path, and controls the autonomous vehicle to travel along the target planned path.
The method for planning the path of an autonomous vehicle provided by the embodiments of the present disclosure first acquires the coordinate information of the autonomous vehicle and the attribute information of the lane line where it is currently located; then, in response to determining based on the attribute information that the current lane is a bidirectional single lane, determines the positional relationship between the projection points corresponding to the vehicle and the current lane line based on the coordinate information; next, determines a target projection point from the projection points based on that positional relationship, and determines the target lane line in the direction corresponding to the target projection point; and finally generates a target planned path based on the target lane line. This method achieves path planning for the autonomous vehicle in a bidirectional single-lane scenario, improves the vehicle's ability to drive on bidirectional single lanes, and offers strong adaptability at low implementation cost.
With continued reference to fig. 3, fig. 3 illustrates a flow 300 of another embodiment of a path planning method for an autonomous vehicle according to the present disclosure. The path planning method of the automatic driving vehicle comprises the following steps:
step 301, acquiring coordinate information of an automatic driving vehicle and attribute information of a lane line where the automatic driving vehicle is currently located.
In this embodiment, the execution subject (e.g., the server 101 shown in fig. 1) of the path planning method of the autonomous vehicle acquires the coordinate information of the autonomous vehicle and the attribute information of the lane line where the autonomous vehicle is currently located. Step 301 is substantially identical to step 201 of the foregoing embodiment, and reference may be made to the foregoing description of step 201 for specific implementation, which is not repeated herein.
In some alternative implementations of the present embodiment, the coordinate information consists of coordinates in the Universal Transverse Mercator (UTM) coordinate system. The UTM coordinate system uses the UTM projection to map segments of the ellipsoid into a planar rectangular coordinate system. Such coordinate systems, and the projections on which they are based, are widely used in topographic mapping, as reference grids for satellite imagery and natural resource databases, and in other applications requiring accurate positioning. Here, the coordinate information generally refers to coordinates in the UTM coordinate system, which may be acquired directly from sensors mounted on the autonomous vehicle.
And step 302, in response to determining that the lane line where the current lane is located is a bidirectional single lane based on the attribute information, projecting coordinates in the UTM coordinate system to the curve SL coordinate system to obtain projection coordinates of the corresponding projection points.
In this embodiment, when the execution body determines that the lane line where the execution body is currently located is a bidirectional single lane based on the attribute information, the execution body projects the coordinates in the UTM coordinate system to the curve SL coordinate system, so as to obtain the projection coordinates of the corresponding projection points. The real-time UTM coordinates of the automatic driving vehicle are projected to a SL coordinate system, so that projection coordinates of projection points corresponding to all real-time track points of the automatic driving vehicle are obtained, and the projection coordinates are SL coordinates. Here, the SL coordinate system is a system in which a road center line is used as a reference, S represents a direction of the road center line, and L represents a direction perpendicular to the road center line. That is, the SL coordinate system represents the lateral and longitudinal distances from the vehicle at a point, the lateral distance being L and the longitudinal distance being S.
Step 303, determining the position relation between the projection point and the current lane line based on the projection coordinates.
In this embodiment, the executing body determines the positional relationship between the projection point and the lane line where the vehicle is currently located based on the projection coordinates. That is, the executing body determines the positional relationship between the projection point of the current position of the automatic driving vehicle and the lane line where the current position is located, that is, whether the automatic driving vehicle is currently located on the left side or the right side of the lane line, based on the L value in the SL coordinate. For example, if L is less than 0, determining that the autonomous vehicle is currently located on the right side of the lane line; if L is greater than 0, it is determined that the autonomous vehicle is currently located to the left of the lane line.
Through the steps, the position relation between the vehicle projection point and the current lane line can be directly judged based on the projection point coordinates.
Step 304, determining a target projection point from the projection points based on the position relation, and determining a target lane line corresponding to the target projection point.
In this embodiment, the execution subject determines a target projection point from among the projection points based on the positional relationship, and determines a target lane line in a direction corresponding to the target projection point.
Step 305, generating a target planned path based on the target lane line.
In this embodiment, the execution subject performs path planning for the autonomous vehicle based on the target lane line and its direction information, thereby generating a target planned path, and controls the autonomous vehicle to travel along that path.
Steps 304-305 are substantially identical to steps 203-204 of the previous embodiments, and reference may be made to the previous descriptions of steps 203-204 for specific implementations, which are not repeated herein.
As can be seen from fig. 3, compared with the embodiment corresponding to fig. 2, the path planning method in this embodiment highlights the step of determining, in a bidirectional single-lane scenario, the positional relationship between the projection points corresponding to the autonomous vehicle and the current lane line. This allows that positional relationship to be determined more quickly and accurately, and also improves the efficiency of the path planning based on it.
With continued reference to fig. 4, fig. 4 shows a flow 400 of yet another embodiment of a path planning method for an autonomous vehicle according to the present disclosure. The path planning method of the automatic driving vehicle comprises the following steps:
step 401, acquiring coordinate information of an automatic driving vehicle and attribute information of a lane line where the automatic driving vehicle is currently located.
In this embodiment, the execution subject (e.g., the server 101 shown in fig. 1) of the path planning method of the autonomous vehicle acquires the coordinate information of the autonomous vehicle and the attribute information of the lane line where the autonomous vehicle is currently located. Step 401 is substantially identical to step 201 of the foregoing embodiment, and specific implementation may refer to the foregoing description of step 201, which is not repeated herein.
In some optional implementations of the present embodiment, the coordinate information includes: the start point coordinates, the end point coordinates, and the current coordinates of the automatically driven vehicle of the preset route of the automatically driven vehicle.
In some implementations, the autonomous vehicle may be an autonomous bus. An autonomous bus follows a preset route with a start point and an end point, and the execution subject obtains the start-point coordinates, the end-point coordinates, and the real-time current coordinates of the vehicle. Current autonomous buses can complete road planning from point a to point b on normal road segments and return the path information required by the PNC (planning and control) module. However, existing technical solutions do not consider all cases: when the vehicle's position and the destination both lie on a bidirectional single-lane road segment, the route and direction information of the two stops must be correctly conveyed to the routing module so that it can produce a reasonable route plan between them. The path planning method provided by the present disclosure solves this path planning problem for autonomous buses in a bidirectional single-lane scenario.
And step 402, in response to determining that the lane line where the current position is located is a bidirectional single lane based on the attribute information, respectively projecting the start point coordinate, the end point coordinate and the current coordinate to the SL coordinate system to obtain the start point projection coordinate of the corresponding start point projection point, the end point projection coordinate of the end point projection point and the current projection coordinate of the current projection point.
In this embodiment, when the execution body determines that the lane line where the execution body is currently located is a bidirectional single lane based on the attribute information, the execution body projects the start point coordinate, the end point coordinate and the current coordinate to the SL coordinate system, so as to obtain the start point projection coordinate of the corresponding start point projection point, the end point projection coordinate of the corresponding end point projection point and the current projection coordinate of the current projection point. The starting point coordinates, the end point coordinates and the real-time UTM coordinates of the automatic driving vehicle are projected under the SL coordinate system respectively, so that the corresponding starting point projection coordinates of the starting point projection points, the corresponding end point projection coordinates of the end point projection points and the corresponding current projection coordinates of the current projection points are obtained, and the projection coordinates are SL coordinates. Here, the SL coordinate system is a system in which a road center line is used as a reference, S represents a direction of the road center line, and L represents a direction perpendicular to the road center line.
In some optional implementations of the present embodiment, the attribute information includes: a set of discrete point coordinates; step 402 includes: determining discrete points closest to the starting point coordinates, the end point coordinates and the current coordinates from the discrete point coordinate set respectively, and marking the discrete points as a first discrete point, a second discrete point and a third discrete point respectively; constructing a first vector based on the start point coordinates and the coordinates of the first discrete points, constructing a second vector based on the end point coordinates and the coordinates of the second discrete points, and constructing a third vector based on the current coordinates and the coordinates of the third discrete points; and respectively projecting the first vector, the second vector and the third vector to the SL coordinate system to obtain corresponding starting point projection coordinates, corresponding end point projection coordinates and corresponding current projection coordinates.
That is, after acquiring the coordinates of the autonomous vehicle, the execution subject uses the high-precision map interface to obtain the lane line attribute information at that point, such as the lane line ID (identifier) and the discrete point coordinate set of the lane.
Specifically, after acquiring the starting point coordinates of the automatic driving vehicle, the executing body acquires the lane line attributes of the starting point (including the lane line ID and the discrete point coordinate set) by using the high-precision map interface, then determines the discrete point closest to the starting point coordinates from the discrete point coordinate set, and marks it as the first discrete point. A first vector is constructed based on the starting point coordinates and the coordinates of the first discrete point, and projected to the SL coordinate system by vector projection to obtain the corresponding starting point projection coordinates, that is, the starting point UTM coordinates are converted into SL coordinates.
After acquiring the end point coordinates of the automatic driving vehicle, the executing body likewise acquires the lane line attributes of the end point (including the lane line ID and the discrete point coordinate set), determines the discrete point closest to the end point coordinates from the discrete point coordinate set, and marks it as the second discrete point. A second vector is constructed based on the end point coordinates and the coordinates of the second discrete point, and projected to the SL coordinate system by vector projection to obtain the corresponding end point projection coordinates, that is, the end point UTM coordinates are converted into SL coordinates.
After acquiring the current coordinates of the automatic driving vehicle, the executing body acquires the lane line attributes of the current point (including the lane line ID and the discrete point coordinate set), determines the discrete point closest to the current coordinates from the discrete point coordinate set, and marks it as the third discrete point. A third vector is constructed based on the current coordinates and the coordinates of the third discrete point, and projected to the SL coordinate system by vector projection to obtain the corresponding current projection coordinates, that is, the current UTM coordinates are converted into SL coordinates.
In this way, the starting point projection coordinates, the end point projection coordinates and the current projection coordinates are all calculated by vector projection.
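As an illustrative sketch (not the implementation disclosed here), the nearest-point lookup, vector construction, and SL projection described above can be combined as follows; the polyline arc-length definition of S and the left-positive sign convention for L are assumptions made for the example:

```python
import math

def to_sl(point, lane_points):
    """Project a UTM point onto a lane line given as a polyline of discrete
    points; returns (s, l) where s is the arc length along the lane line and
    l is the signed lateral offset (positive to the left of travel)."""
    # 1. find the discrete point closest to the query point
    idx = min(range(len(lane_points)),
              key=lambda i: math.dist(point, lane_points[i]))
    # 2. local tangent direction of the lane line near that point
    a = lane_points[idx - 1] if idx == len(lane_points) - 1 else lane_points[idx]
    b = lane_points[idx] if idx == len(lane_points) - 1 else lane_points[idx + 1]
    tx, ty = b[0] - a[0], b[1] - a[1]
    norm = math.hypot(tx, ty) or 1.0
    tx, ty = tx / norm, ty / norm
    # 3. vector from the nearest discrete point to the query point
    vx, vy = point[0] - lane_points[idx][0], point[1] - lane_points[idx][1]
    # s: arc length up to the nearest point plus the tangential component
    s = sum(math.dist(lane_points[k], lane_points[k + 1]) for k in range(idx))
    s += vx * tx + vy * ty
    # l: 2-D cross product of tangent and vector = signed lateral offset
    l = tx * vy - ty * vx
    return s, l
```

The same routine converts the starting point, end point and current coordinates; only the query point changes.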
Step 403, determining the position relation between the projection point and the current lane line based on the projection coordinates.
In this embodiment, the executing body determines the positional relationship between each projection point and the lane line where the automatic driving vehicle is currently located based on the projection coordinates. Since the starting point projection coordinates, the end point projection coordinates and the current projection coordinates are all SL coordinates, the executing body may determine the positional relationship between the starting point and the current lane line based on the starting point SL coordinates, the positional relationship between the end point and the current lane line based on the end point SL coordinates, and the positional relationship between the current point and the current lane line based on the current point SL coordinates.
In some optional implementations of this embodiment, the positional relationship includes: the projection point corresponding to the automatic driving vehicle is located on the left side of the lane line where the automatic driving vehicle is currently located, or on the right side of that lane line. In the SL coordinate system, S is the longitudinal distance of a point along the lane reference line and L is its lateral offset from that line. The executing body therefore determines the positional relationship between the projection point of the current position of the automatic driving vehicle and the lane line where it is currently located, that is, whether the automatic driving vehicle is on the left or right side of the lane line, based on the L value of the SL coordinates. For example, if L is less than 0, the automatic driving vehicle is determined to be currently located on the right side of the lane line; if L is greater than 0, it is determined to be currently located on the left side of the lane line.
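The sign test above reduces to a few lines; `l_value` here is the L component of the SL projection coordinates obtained in step 402:

```python
def side_of_lane_line(l_value):
    """Classify a projection point by the sign of its L coordinate:
    L < 0 means right of the lane line, L > 0 means left."""
    if l_value < 0:
        return "right"
    if l_value > 0:
        return "left"
    return "on"  # the degenerate L == 0 case is not addressed in the text
```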
Step 404, determining a target projection point from the projection points based on the position relationship, and determining a target lane line corresponding to the target projection point.
In this embodiment, the execution subject determines a target projection point from among the projection points based on the positional relationship, and determines a target lane line in a direction corresponding to the target projection point.
In some alternative implementations of the present embodiment, step 404 includes: in response to determining that the projection point corresponding to the automatic driving vehicle is located on the right side of the lane line where the automatic driving vehicle is currently located, taking the starting point projection point, the end point projection point and the current projection point as the target projection points, and taking the lane line where the automatic driving vehicle is currently located as the target lane line; and in response to determining that the projection point corresponding to the automatic driving vehicle is located on the left side of the lane line where the automatic driving vehicle is currently located, taking the reverse lane line of the lane line where the automatic driving vehicle is currently located as the target lane line, and taking the projection points among the starting point projection point, the end point projection point and the current projection point that are in the same direction as the reverse lane line as the target projection points.
In this implementation, if it is determined that the projection point corresponding to the automatic driving vehicle is located on the right side of the lane line where the vehicle is currently located, then, combined with the keep-right traffic rule, the starting point projection point, the end point projection point and the current projection point can be directly used as the target projection points, and the lane line where the vehicle is currently located can be used as the target lane line. If it is determined that the projection point corresponding to the automatic driving vehicle is located on the left side of the lane line where the vehicle is currently located, then, again applying the keep-right traffic rule, the executing body may acquire information of the reverse lane line of the current lane line, take the reverse lane line as the target lane line, and take those of the starting point projection point, the end point projection point and the current projection point whose direction matches the reverse lane line as the target projection points. Thus, the target projection points and the target lane lines under different positional relationships are determined.
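The selection rule above can be sketched as follows. This is an illustrative sketch, not the disclosed implementation; the lane identifiers, the dict layout, and the precomputed same-direction flags are assumptions made for the example:

```python
def select_target(side, current_lane, reverse_lane, projections):
    """Choose the target lane line and target projection points on a
    bidirectional single lane under the keep-right rule.

    projections maps point names ('start', 'end', 'current') to
    (sl_coords, same_direction_as_reverse_lane) pairs."""
    if side == "right":
        # vehicle already projects to the right: keep the current lane line
        # and use all three projection points as target projection points
        return current_lane, {name: sl for name, (sl, _) in projections.items()}
    # vehicle projects to the left: plan along the reverse lane line and keep
    # only the projection points oriented the same way as that reverse lane
    return reverse_lane, {name: sl for name, (sl, same) in projections.items() if same}
```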
Step 405, a target planned path is generated based on the target lane line.
In this embodiment, the execution subject generates the target planned path based on the target lane line.
In some optional implementations of this embodiment, the method for path planning of an autonomous vehicle further includes: in response to determining that the lane line where the automatic driving vehicle is currently located is a non-bidirectional single lane, eliminating the discrete points that do not satisfy a preset condition from the discrete point coordinate set based on the distance between the current coordinates and each discrete point in the discrete point coordinate set.
In this implementation, when the executing body determines that the lane line where the automatic driving vehicle is currently located is a non-bidirectional single lane, a path point rejection algorithm is added on top of the original processing: according to the distance between the current coordinates of the automatic driving vehicle and each discrete point in the discrete point coordinate set of the lane line, discrete points that do not satisfy a preset condition are eliminated from the discrete point coordinate set, so as to prevent planning a circuitous path.
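A minimal sketch of such a rejection step, assuming the unstated preset condition is a maximum allowed distance from the current position (`max_dist` is a placeholder threshold, not a value from this disclosure):

```python
import math

def prune_discrete_points(current, points, max_dist=50.0):
    """Drop lane-line discrete points farther than max_dist from the current
    position, so that distant points cannot pull the plan into a detour.
    max_dist stands in for the unspecified preset condition."""
    return [p for p in points if math.dist(current, p) <= max_dist]
```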
As can be seen from fig. 4, compared with the embodiment corresponding to fig. 3, the method for planning the path of the automatic driving vehicle in this embodiment highlights the steps of determining the target projection point and the target lane line in the bidirectional single-lane scene, so that path planning for the automatic driving vehicle is performed more quickly and accurately.
With further reference to fig. 5, as an implementation of the method shown in the foregoing figures, the present disclosure provides an embodiment of a path planning apparatus for an autonomous vehicle, which corresponds to the method embodiment shown in fig. 2, and which is particularly applicable to various electronic devices.
As shown in fig. 5, the path planning apparatus 500 of the autonomous vehicle of the present embodiment includes: an acquisition module 501, a first determination module 502, a second determination module 503, and a generation module 504. The acquiring module 501 is configured to acquire coordinate information of an automatic driving vehicle and attribute information of a lane line where the automatic driving vehicle is currently located; the first determining module 502 is configured to determine a position relationship between a projection point corresponding to the automatic driving vehicle and the lane line where the current lane line is located based on the coordinate information in response to determining that the lane line where the current lane line is located is a bidirectional single lane based on the attribute information; a second determining module 503 configured to determine a target projection point from the projection points based on the positional relationship, and determine a target lane line in a direction corresponding to the target projection point; the generation module 504 is configured to generate a target planned path based on the target lane line.
In the present embodiment, in the path planning apparatus 500 of the autonomous vehicle: the specific processing of the obtaining module 501, the first determining module 502, the second determining module 503 and the generating module 504 and the technical effects thereof may refer to the relevant descriptions of steps 201 to 204 in the corresponding embodiment of fig. 2, and are not repeated herein.
In some optional implementations of this embodiment, the coordinate information is coordinates in a Universal Transverse Mercator (UTM) coordinate system; the first determination module includes: the projection sub-module is configured to project the coordinates under the UTM coordinate system to the curve SL coordinate system to obtain projection coordinates of the corresponding projection points; the determining sub-module is configured to determine the position relation between the projection point and the lane line where the automatic driving vehicle is currently located based on the projection coordinates.
In some optional implementations of the present embodiment, the coordinate information includes: starting point coordinates, end point coordinates and current coordinates of a preset route of the automatic driving vehicle; the projection submodule includes: and the projection unit is configured to project the starting point coordinates, the end point coordinates and the current coordinates to the SL coordinate system respectively to obtain the starting point projection coordinates of the corresponding starting point projection points, the end point projection coordinates of the corresponding end point projection points and the current projection coordinates of the current projection points.
In some optional implementations of the present embodiment, the attribute information includes: a set of discrete point coordinates; and the projection unit is further configured to: determine the discrete points closest to the starting point coordinates, the end point coordinates and the current coordinates from the discrete point coordinate set, and mark them as a first discrete point, a second discrete point and a third discrete point respectively; construct a first vector based on the starting point coordinates and the coordinates of the first discrete point, construct a second vector based on the end point coordinates and the coordinates of the second discrete point, and construct a third vector based on the current coordinates and the coordinates of the third discrete point; and project the first vector, the second vector and the third vector to the SL coordinate system respectively to obtain the corresponding starting point projection coordinates, end point projection coordinates and current projection coordinates.
In some optional implementations of this embodiment, the positional relationship includes: the projection point corresponding to the automatic driving vehicle is located on the left side of the lane line where the automatic driving vehicle is currently located, or on the right side of that lane line; and the second determination module is further configured to: in response to determining that the projection point corresponding to the automatic driving vehicle is located on the right side of the lane line where the automatic driving vehicle is currently located, take the starting point projection point, the end point projection point and the current projection point as the target projection points, and take the lane line where the automatic driving vehicle is currently located as the target lane line; and in response to determining that the projection point corresponding to the automatic driving vehicle is located on the left side of the lane line where the automatic driving vehicle is currently located, take the reverse lane line of the lane line where the automatic driving vehicle is currently located as the target lane line, and take the projection points among the starting point projection point, the end point projection point and the current projection point that are in the same direction as the reverse lane line as the target projection points.
In some optional implementations of the present embodiment, the path planning apparatus 500 of an autonomous vehicle further includes: a rejecting module configured to, in response to determining that the lane line where the automatic driving vehicle is currently located is a non-bidirectional single lane, eliminate the discrete points that do not satisfy a preset condition from the discrete point coordinate set based on the distance between the current coordinates and each discrete point in the discrete point coordinate set.
According to embodiments of the present disclosure, the present disclosure also provides an electronic device, a readable storage medium and a computer program product.
Fig. 6 illustrates a schematic block diagram of an example electronic device 600 that may be used to implement embodiments of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smartphones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 6, the apparatus 600 includes a computing unit 601 that can perform various appropriate actions and processes according to a computer program stored in a Read Only Memory (ROM) 602 or a computer program loaded from a storage unit 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data required for the operation of the device 600 may also be stored. The computing unit 601, ROM 602, and RAM 603 are connected to each other by a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
Various components in the device 600 are connected to the I/O interface 605, including: an input unit 606 such as a keyboard, mouse, etc.; an output unit 607 such as various types of displays, speakers, and the like; a storage unit 608, such as a magnetic disk, optical disk, or the like; and a communication unit 609 such as a network card, modem, wireless communication transceiver, etc. The communication unit 609 allows the device 600 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The computing unit 601 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of computing unit 601 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various computing units running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, etc. The computing unit 601 performs the various methods and processes described above, such as a path planning method of an autonomous vehicle. For example, in some embodiments, the path planning method of an autonomous vehicle may be implemented as a computer software program tangibly embodied on a machine-readable medium, such as the storage unit 608. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 600 via the ROM 602 and/or the communication unit 609. When the computer program is loaded into the RAM 603 and executed by the computing unit 601, one or more steps of the path planning method of an autonomous vehicle described above may be performed. Alternatively, in other embodiments, the computing unit 601 may be configured to perform the path planning method of the autonomous vehicle by any other suitable means (e.g., by means of firmware).
Various implementations of the systems and techniques described here above may be implemented in digital electronic circuitry, integrated circuit systems, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on Chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that may be executed and/or interpreted on a programmable system including at least one programmable processor, which may be a special purpose or general-purpose programmable processor, and that may receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for carrying out methods of the present disclosure may be written in any combination of one or more programming languages. These program code may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus such that the program code, when executed by the processor or controller, causes the functions/operations specified in the flowchart and/or block diagram to be implemented. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and pointing device (e.g., a mouse or trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: Local Area Networks (LANs), Wide Area Networks (WANs), and the internet.
The computer system may include a client and a server. The client and server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, a server of a distributed system, or a server incorporating a blockchain.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps recited in the present disclosure may be performed in parallel or sequentially or in a different order, provided that the desired results of the technical solutions of the present disclosure are achieved, and are not limited herein.
The above detailed description should not be taken as limiting the scope of the present disclosure. Those skilled in the art will appreciate that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present disclosure are intended to be included within the scope of the present disclosure.

Claims (15)

1. A path planning method of an autonomous vehicle, comprising:
acquiring coordinate information of an automatic driving vehicle and attribute information of a lane line where the automatic driving vehicle is currently located;
in response to determining, based on the attribute information, that the lane line where the automatic driving vehicle is currently located is a bidirectional single lane, determining the position relation between the projection point corresponding to the automatic driving vehicle and the lane line where the automatic driving vehicle is currently located based on the coordinate information;
determining a target projection point from the projection points based on the position relation, and determining a target lane line in a direction corresponding to the target projection point;
and generating a target planning path based on the target lane line.
2. The method of claim 1, wherein the coordinate information is coordinates in a Universal Transverse Mercator (UTM) coordinate system; and
the determining, based on the coordinate information, a positional relationship between a projection point corresponding to the autonomous vehicle and the lane line where the autonomous vehicle is currently located, includes:
projecting the coordinates in the UTM coordinate system to a curve SL coordinate system to obtain projection coordinates of the corresponding projection points;
and determining the position relation between the projection point and the lane line where the current position is based on the projection coordinates.
3. The method of claim 2, wherein the coordinate information comprises: the starting point coordinates and the end point coordinates of a preset route of the automatic driving vehicle and the current coordinates of the automatic driving vehicle; and
the projecting the coordinates in the UTM coordinate system to the curve SL coordinate system to obtain projection coordinates of the corresponding projection points, including:
and respectively projecting the starting point coordinate, the end point coordinate and the current coordinate to the SL coordinate system to obtain a starting point projection coordinate of a corresponding starting point projection point, an end point projection coordinate of an end point projection point and a current projection coordinate of a current projection point.
4. A method according to claim 3, wherein the attribute information comprises: a set of discrete point coordinates; and
the step of projecting the start point coordinate, the end point coordinate and the current coordinate to the SL coordinate system to obtain a start point projection coordinate of a corresponding start point projection point, an end point projection coordinate of an end point projection point and a current projection coordinate of a current projection point, respectively, includes:
determining discrete points closest to the starting point coordinates, the end point coordinates and the current coordinates from the discrete point coordinate set respectively, and marking the discrete points as a first discrete point, a second discrete point and a third discrete point respectively;
constructing a first vector based on the start point coordinates and the coordinates of the first discrete point, constructing a second vector based on the end point coordinates and the coordinates of the second discrete point, and constructing a third vector based on the current coordinates and the coordinates of the third discrete point;
and respectively projecting the first vector, the second vector and the third vector to the SL coordinate system to obtain corresponding starting point projection coordinates, corresponding ending point projection coordinates and corresponding current projection coordinates.
5. The method of claim 4, wherein the positional relationship comprises: the projection point corresponding to the automatic driving vehicle is positioned at the left side of the lane line where the automatic driving vehicle is currently positioned or the projection point corresponding to the automatic driving vehicle is positioned at the right side of the lane line where the automatic driving vehicle is currently positioned; and
the determining, based on the positional relationship, a target projection point from the projection points, and determining a target lane line in a direction corresponding to the target projection point, includes:
in response to determining that a projection point corresponding to the automatic driving vehicle is positioned on the right side of the current lane line, taking the starting point projection point, the ending point projection point and the current projection point as target projection points, and taking the current lane line as the target lane line;
and in response to determining that the projection point corresponding to the automatic driving vehicle is located on the left side of the lane line where the automatic driving vehicle is currently located, taking the reverse lane line of the lane line where the automatic driving vehicle is currently located as the target lane line, and taking the projection points among the starting point projection point, the end point projection point and the current projection point that are in the same direction as the reverse lane line as the target projection point.
6. The method of claim 4, further comprising:
and in response to determining that the lane line where the automatic driving vehicle is currently located is a non-bidirectional single lane, eliminating discrete points that do not satisfy a preset condition from the discrete point coordinate set based on the distance between the current coordinates and each discrete point in the discrete point coordinate set.
7. A path planning apparatus of an autonomous vehicle, comprising:
the acquisition module is configured to acquire coordinate information of the automatic driving vehicle and attribute information of a lane line where the automatic driving vehicle is currently located;
the first determining module is configured to determine the position relation between the projection point corresponding to the automatic driving vehicle and the lane line where the current is located based on the coordinate information in response to determining that the lane line where the current is located is a bidirectional single lane based on the attribute information;
a second determining module configured to determine a target projection point from the projection points based on the positional relationship, and determine a target lane line in a direction corresponding to the target projection point;
and the generation module is configured to generate a target planning path based on the target lane line.
8. The apparatus of claim 7, wherein the coordinate information is coordinates in a Universal Transverse Mercator (UTM) coordinate system; and
the first determining module includes:
the projection sub-module is configured to project the coordinates in the UTM coordinate system to a curve SL coordinate system to obtain projection coordinates of the corresponding projection points;
and the determining submodule is configured to determine the position relation between the projection point and the lane line where the projection point is currently positioned based on the projection coordinates.
9. The apparatus of claim 8, wherein the coordinate information comprises: the starting point coordinates and the end point coordinates of a preset route of the automatic driving vehicle and the current coordinates of the automatic driving vehicle; and
the projection submodule includes:
and the projection unit is configured to respectively project the starting point coordinate, the ending point coordinate and the current coordinate to the SL coordinate system to obtain a starting point projection coordinate of a corresponding starting point projection point, an ending point projection coordinate of an ending point projection point and a current projection coordinate of a current projection point.
10. The apparatus of claim 9, wherein the attribute information comprises: a set of discrete point coordinates; and
the projection unit is further configured to:
determining discrete points closest to the starting point coordinates, the end point coordinates and the current coordinates from the discrete point coordinate set respectively, and marking the discrete points as a first discrete point, a second discrete point and a third discrete point respectively;
constructing a first vector based on the start point coordinates and the coordinates of the first discrete point, constructing a second vector based on the end point coordinates and the coordinates of the second discrete point, and constructing a third vector based on the current coordinates and the coordinates of the third discrete point;
and projecting the first vector, the second vector and the third vector into the SL coordinate system, respectively, to obtain the corresponding starting point projection coordinates, end point projection coordinates and current projection coordinates.
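The three steps recited in claim 10 (nearest discrete point, vector construction, decomposition in the SL frame) can be sketched for a single coordinate as follows. The record layout for discrete points (`x`, `y`, accumulated arc length `s`, tangent `heading`) and the function name are assumptions; the claim does not fix any particular data structure.

```python
import math

def project_via_nearest(query, discrete_points):
    """Find the discrete reference-line point nearest to `query`,
    build the vector from that point to the query, and decompose it
    in the point's local Frenet frame to obtain (s, l)."""
    # Step 1: nearest discrete point by Euclidean distance.
    nearest = min(
        discrete_points,
        key=lambda p: math.hypot(query[0] - p["x"], query[1] - p["y"]),
    )
    # Step 2: vector from the discrete point to the query coordinate.
    vx, vy = query[0] - nearest["x"], query[1] - nearest["y"]
    # Step 3: decompose along the local tangent (s) and left normal (l).
    tx, ty = math.cos(nearest["heading"]), math.sin(nearest["heading"])
    ds = vx * tx + vy * ty        # longitudinal component
    l = -vx * ty + vy * tx        # lateral component, positive = left
    return nearest["s"] + ds, l
```

Applied once each to the starting point, end point and current coordinates, this yields the three projection coordinate pairs the claim recites.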
11. The apparatus of claim 10, wherein the positional relationship comprises: the projection point corresponding to the automatic driving vehicle is located on the left side of the lane line where the automatic driving vehicle is currently located, or the projection point corresponding to the automatic driving vehicle is located on the right side of that lane line; and
the second determination module is further configured to:
in response to determining that the projection point corresponding to the automatic driving vehicle is located on the right side of the lane line where the automatic driving vehicle is currently located, taking the starting point projection point, the end point projection point and the current projection point as the target projection points, and taking that lane line as the target lane line;
and in response to determining that the projection point corresponding to the automatic driving vehicle is located on the left side of the lane line where the automatic driving vehicle is currently located, taking the reverse lane line of that lane line as the target lane line, and taking, from among the starting point projection point, the end point projection point and the current projection point, the projection points in the same direction as the reverse lane line as the target projection points.
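The branch in claim 11 reduces to a sign test on the lateral SL coordinate. Below is a minimal sketch assuming the common convention that `l > 0` means left of the lane line and `l <= 0` means right; the lane-line placeholders, the `(name, direction)` representation of projection points, and the function name are all illustrative assumptions.

```python
def select_targets(current_l, current_lane, reverse_lane, projections):
    """Pick the target lane line and target projection points from the
    vehicle's signed lateral offset.  `projections` is a list of
    (name, direction) pairs, direction being "forward" or "reverse"."""
    if current_l <= 0:
        # Right of the current lane line: keep all projection points
        # and plan on the current lane line.
        return current_lane, [name for name, _ in projections]
    # Left of the current lane line: switch to the reverse lane line and
    # keep only projection points whose direction matches it.
    return reverse_lane, [name for name, d in projections if d == "reverse"]
```

For example, a vehicle projecting to the right keeps all three projection points on the current lane line, while one projecting to the left retains only the reverse-direction projections on the reverse lane line.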
12. The apparatus of claim 10, further comprising:
and the rejecting module is configured to, in response to determining that the lane line at the current position is a non-bidirectional single lane, reject from the discrete point coordinate set, based on the distance between the current coordinates and each discrete point in the discrete point coordinate set, those discrete points whose distance does not satisfy a preset condition.
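The rejecting module of claim 12 can be sketched as a distance filter. The claim only says the distance must satisfy a preset condition; modeling that condition as a plain threshold `max_dist` is an assumption, as are the function and parameter names.

```python
import math

def prune_discrete_points(current, points, max_dist=50.0):
    """Keep only the discrete points within `max_dist` of the current
    coordinates; the rest are rejected from the candidate set."""
    cx, cy = current
    return [
        p for p in points
        if math.hypot(p[0] - cx, p[1] - cy) <= max_dist
    ]
```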
13. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-6.
14. A non-transitory computer-readable storage medium storing computer instructions for causing a computer to perform the method of any one of claims 1-6.
15. A computer program product comprising a computer program which, when executed by a processor, implements the method according to any one of claims 1-6.
CN202311601456.9A 2023-11-28 2023-11-28 Method, device, equipment and storage medium for planning path of automatic driving vehicle Pending CN117470269A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311601456.9A CN117470269A (en) 2023-11-28 2023-11-28 Method, device, equipment and storage medium for planning path of automatic driving vehicle

Publications (1)

Publication Number Publication Date
CN117470269A true CN117470269A (en) 2024-01-30

Family

ID=89629387

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311601456.9A Pending CN117470269A (en) 2023-11-28 2023-11-28 Method, device, equipment and storage medium for planning path of automatic driving vehicle

Country Status (1)

Country Link
CN (1) CN117470269A (en)

Similar Documents

Publication Publication Date Title
CN111209978B (en) Three-dimensional visual repositioning method and device, computing equipment and storage medium
CN113899363B (en) Vehicle positioning method and device and automatic driving vehicle
CN112560680A (en) Lane line processing method and device, electronic device and storage medium
CN111081033B (en) Method and device for determining orientation angle of vehicle
CN113223113B (en) Lane line processing method and device, electronic equipment and cloud control platform
CN114506343A (en) Trajectory planning method, device, equipment, storage medium and automatic driving vehicle
CN115855084A (en) Map data fusion method and device, electronic equipment and automatic driving product
EP4119413A1 (en) Error compensation method and apparatus, computer device, and storage medium
CN114047760A (en) Path planning method and device, electronic equipment and automatic driving vehicle
CN113932796A (en) High-precision map lane line generation method and device and electronic equipment
CN113722342A (en) High-precision map element change detection method, device and equipment and automatic driving vehicle
CN113126120A (en) Data annotation method, device, equipment, storage medium and computer program product
CN115900697B (en) Object motion trail information processing method, electronic equipment and automatic driving vehicle
CN112925302A (en) Robot pose control method and device
CN116215517A (en) Collision detection method, device, apparatus, storage medium, and autonomous vehicle
CN117470269A (en) Method, device, equipment and storage medium for planning path of automatic driving vehicle
CN115761702A (en) Vehicle track generation method and device, electronic equipment and computer readable medium
CN116091567A (en) Registration method and device of automatic driving vehicle, electronic equipment and vehicle
CN114581869A (en) Method and device for determining position of target object, electronic equipment and storage medium
CN113449798A (en) Port unmanned driving map generation method and device, electronic equipment and storage medium
CN114429631A (en) Three-dimensional object detection method, device, equipment and storage medium
CN113191279A (en) Data annotation method, device, equipment, storage medium and computer program product
CN114407916B (en) Vehicle control and model training method and device, vehicle, equipment and storage medium
CN112966059B (en) Data processing method and device for positioning data, electronic equipment and medium
CN115294764B (en) Crosswalk area determination method, crosswalk area determination device, crosswalk area determination equipment and automatic driving vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination