CN115657684B - Vehicle path information generation method, device, equipment and computer readable medium - Google Patents

Vehicle path information generation method, device, equipment and computer readable medium

Info

Publication number
CN115657684B
CN115657684B (application CN202211568606.6A)
Authority
CN
China
Prior art keywords
information
lane
obstacle
vehicle
feature information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211568606.6A
Other languages
Chinese (zh)
Other versions
CN115657684A (en)
Inventor
Ni Kai (倪凯)
Current Assignee
Heduo Technology (Guangzhou) Co.,Ltd.
Original Assignee
HoloMatic Technology Beijing Co Ltd
Priority date
Filing date
Publication date
Application filed by HoloMatic Technology Beijing Co Ltd filed Critical HoloMatic Technology Beijing Co Ltd
Priority to CN202211568606.6A priority Critical patent/CN115657684B/en
Publication of CN115657684A publication Critical patent/CN115657684A/en
Application granted granted Critical
Publication of CN115657684B publication Critical patent/CN115657684B/en

Abstract

The embodiments of the disclosure disclose a vehicle path information generation method, apparatus, device and computer readable medium. One embodiment of the method comprises: acquiring current vehicle perception information, lane line perception information, a first obstacle feature information set, a second obstacle feature information set and a lane path information sequence; performing fusion processing on the first obstacle feature information set and the second obstacle feature information set to obtain a current lane obstacle feature information group and a left lane obstacle feature information group; generating lane change expectation information based on the current vehicle perception information and the current lane obstacle feature information group; generating lane merging information in response to determining that the lane change expectation information meets a preset expected lane change condition; performing detection processing on the left lane obstacle feature information group to obtain lane change decision information; generating lane change trajectory information; and generating vehicle path information and sending the vehicle path information to the vehicle control module to control the vehicle to drive. This embodiment can shorten the travel time of the vehicle.

Description

Vehicle path information generation method, device, equipment and computer readable medium
Technical Field
The embodiment of the disclosure relates to the technical field of computers, in particular to a vehicle path information generation method, device, equipment and computer readable medium.
Background
A vehicle path information generation method is an important technology in the field of intelligent driving. At present, vehicle path information is generally generated as follows: the vehicle-mounted navigation device plans and outputs vehicle path information consisting of a number of lane paths according to the current position and the destination of the vehicle, so as to control the vehicle to drive safely along the planned path to the destination.
However, the inventors have found that when vehicle path information is generated in the above manner, the following technical problems often arise:
first, although the vehicle can travel to the destination along the planned route, it has to travel below the desired speed in order to keep a safe distance during long-term car following, so the trip takes longer and driving efficiency is reduced;
second, if the vehicle changes lanes midway in order to increase its speed, the vehicle path information has to be regenerated by the vehicle-mounted navigation according to the vehicle's position and destination, which occupies more computing resources;
third, during a lane change the vehicle may encounter emergencies such as sudden deceleration of the vehicle ahead in the target lane or sudden acceleration of the vehicle behind in the target lane; if the vehicle continues the lane change along the originally planned path, it may collide with a surrounding vehicle, reducing driving safety.
The information disclosed in this Background section is only intended to enhance understanding of the background of the inventive concept, and it may therefore contain information that does not form prior art already known to a person of ordinary skill in the art in this country.
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
Some embodiments of the present disclosure propose a vehicle path information generation method, apparatus, device and computer readable medium to solve one or more of the technical problems mentioned in the background section above.
In a first aspect, some embodiments of the present disclosure provide a vehicle path information generation method, including: acquiring current vehicle perception information, lane line perception information, a first obstacle feature information set, a second obstacle feature information set and a lane path information sequence; performing fusion processing on the first obstacle feature information set and the second obstacle feature information set based on the lane line perception information to obtain a current lane obstacle feature information group and a left lane obstacle feature information group; generating lane change expectation information based on the current vehicle perception information and the current lane obstacle feature information group; in response to determining that the lane change expectation information meets a preset expected lane change condition, generating lane merging information based on the current vehicle perception information and the lane path information sequence; in response to determining that the lane merging information meets a preset distance condition, performing detection processing on the left lane obstacle feature information group to obtain lane change decision information; in response to determining that the lane change decision information meets a preset lane change condition, generating lane change trajectory information based on a preset high-precision map, the current vehicle perception information, the left lane obstacle feature information group and the lane line perception information; and generating vehicle path information based on the lane change trajectory information, the lane merging information and the lane path information sequence, and sending the vehicle path information to a vehicle control module for controlling the vehicle to drive, wherein generating the vehicle path information includes: generating merging trajectory information based on the lane change trajectory information and the lane merging information; intercepting the lane path information sequence based on the lane merging information to obtain an intercepted lane path information sequence; and determining the lane change trajectory information, the merging trajectory information and the intercepted lane path information sequence as the vehicle path information.
In a second aspect, some embodiments of the present disclosure provide a vehicle path information generation apparatus, including: an acquisition unit configured to acquire current vehicle perception information, lane line perception information, a first obstacle feature information set, a second obstacle feature information set and a lane path information sequence; a fusion processing unit configured to perform fusion processing on the first obstacle feature information set and the second obstacle feature information set based on the lane line perception information to obtain a current lane obstacle feature information group and a left lane obstacle feature information group; a first generating unit configured to generate lane change expectation information based on the current vehicle perception information and the current lane obstacle feature information group; a second generating unit configured to generate lane merging information based on the current vehicle perception information and the lane path information sequence in response to determining that the lane change expectation information meets a preset expected lane change condition; a detection processing unit configured to perform detection processing on the left lane obstacle feature information group to obtain lane change decision information in response to determining that the lane merging information meets a preset distance condition; a third generating unit configured to generate lane change trajectory information based on a preset high-precision map, the current vehicle perception information, the left lane obstacle feature information group and the lane line perception information in response to determining that the lane change decision information meets a preset lane change condition; and a generating and sending unit configured to generate vehicle path information based on the lane change trajectory information, the lane merging information and the lane path information sequence, and to send the vehicle path information to a vehicle control module for controlling the vehicle to drive, wherein generating the vehicle path information includes: generating merging trajectory information based on the lane change trajectory information and the lane merging information; intercepting the lane path information sequence based on the lane merging information to obtain an intercepted lane path information sequence; and determining the lane change trajectory information, the merging trajectory information and the intercepted lane path information sequence as the vehicle path information.
In a third aspect, some embodiments of the present disclosure provide an electronic device, comprising: one or more processors; a storage device having one or more programs stored thereon, which when executed by one or more processors, cause the one or more processors to implement the method described in any of the implementations of the first aspect.
In a fourth aspect, some embodiments of the present disclosure provide a computer-readable medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the method described in any of the implementations of the first aspect.
The above embodiments of the present disclosure have the following advantages: the vehicle path information generation method of some embodiments of the present disclosure can shorten the travel time of the vehicle. Specifically, the reason the vehicle takes a long time to travel is that, although it can travel along the planned route to the destination, it has to travel below the desired speed in order to maintain a safe distance during long-term car following. Based on this, the method first acquires current vehicle perception information, lane line perception information, a first obstacle feature information set, a second obstacle feature information set and a lane path information sequence. Information about the current vehicle, the lane lines of the road where it is located and the obstacle vehicles can thus be obtained, from which lane change decision information can later be generated to determine whether the current vehicle needs to change to the left lane to drive faster. Then, based on the lane line perception information, the first obstacle feature information set and the second obstacle feature information set are fused to obtain a current lane obstacle feature information group and a left lane obstacle feature information group. In this way, lane change expectation information can be generated from the obstacle vehicles on the current lane, and lane change decision information from the obstacle vehicles on the left lane. Next, lane change expectation information is generated based on the current vehicle perception information and the current lane obstacle feature information group.
In response to determining that the lane change expectation information meets the preset expected lane change condition, lane merging information is generated based on the current vehicle perception information and the lane path information sequence. In response to determining that the lane merging information meets the preset distance condition, detection processing is performed on the left lane obstacle feature information group to obtain lane change decision information. Thus, whether the current vehicle needs to change to the left lane for faster driving can be determined from its lane change expectation information. Since the current vehicle must keep driving toward the destination, lane merging information is further generated on the basis of the lane change expectation information, and the lane change decision information is finally determined from the obstacle vehicles on the left lane. Whether to generate a lane change trajectory that lets the vehicle change to the faster left lane is then decided from the lane change decision information, increasing the vehicle speed and shortening the travel time. Then, in response to determining that the lane change decision information meets the preset lane change condition, lane change trajectory information is generated based on the preset high-precision map, the current vehicle perception information, the left lane obstacle feature information group and the lane line perception information. The lane change decision information thus establishes that the current vehicle is about to change to the left lane, and on that basis the lane change trajectory information is generated for the current vehicle to perform the lane change.
Finally, vehicle path information is generated based on the lane change trajectory information, the lane merging information and the lane path information sequence, and is sent to the vehicle control module to control the vehicle to drive. Generating the vehicle path information includes: generating merging trajectory information based on the lane change trajectory information and the lane merging information; intercepting the lane path information sequence based on the lane merging information to obtain an intercepted lane path information sequence; and determining the lane change trajectory information, the merging trajectory information and the intercepted lane path information sequence as the vehicle path information. Vehicle path information can thus be generated that lets the vehicle travel at a higher speed, shortening the travel time. In summary, the method increases the driving speed by changing to the faster lane while the vehicle keeps a safe distance from the vehicle ahead. Because the vehicle still needs to drive toward the destination after the lane change, the lane merging information is taken into account both when determining the lane change decision information and when later generating the vehicle path information, so that the vehicle follows, as far as possible, the shortest path planned at the departure point. The travel time of the vehicle can therefore be shortened, and driving efficiency improved.
Drawings
The above and other features, advantages and aspects of various embodiments of the present disclosure will become more apparent by referring to the following detailed description when taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that components and elements are not necessarily drawn to scale.
FIG. 1 is a flow diagram of some embodiments of a vehicle path information generation method according to the present disclosure;
FIG. 2 is a schematic block diagram of some embodiments of a vehicle path information generating device according to the present disclosure;
FIG. 3 is a schematic block diagram of an electronic device suitable for use in implementing some embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the disclosure are shown in the drawings, it is to be understood that the disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings. The embodiments and features of the embodiments in the present disclosure may be combined with each other without conflict.
It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It is noted that references to "a" or "an" in this disclosure are intended to be illustrative rather than limiting, and that those skilled in the art will appreciate that references to "one or more" are intended to be exemplary and not limiting unless the context clearly indicates otherwise.
The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 illustrates a flow 100 of some embodiments of a vehicle path information generation method according to the present disclosure. The vehicle path information generation method includes the steps of:
Step 101, acquiring current vehicle perception information, lane line perception information, a first obstacle feature information set, a second obstacle feature information set and a lane path information sequence.
In some embodiments, an execution subject (e.g., a vehicle-mounted terminal) of the vehicle path information generation method may acquire the current vehicle perception information, the lane line perception information, the first obstacle feature information set, the second obstacle feature information set and the lane path information sequence through a wired or wireless connection. The current vehicle perception information may include a vehicle speed value and current vehicle coordinates. The vehicle speed value may be the current vehicle speed value output by the inertial navigation device. The current vehicle coordinates may be the GPS (Global Positioning System) positioning coordinates of the current vehicle output by the vehicle-mounted navigation device. The lane line perception information may be information, output by the smart camera, about the lane lines on both sides of the lane where the current vehicle is located. The lane line perception information may include left lane line information and right lane line information. The left lane line information may include a left lane line curve and a left lane line distance value. The right lane line information may include a right lane line curve and a right lane line distance value. The left lane line curve may represent the left lane line of the lane where the current vehicle is located. The left lane line distance value may be the distance from the center point of the front axle of the current vehicle to the left lane line. The right lane line curve may represent the right lane line of the lane where the current vehicle is located. The right lane line distance value may be the distance from the center point of the front axle of the current vehicle to the right lane line. The first obstacle feature information in the first obstacle feature information set may be information about obstacle vehicles output by the smart camera. An obstacle vehicle may be a vehicle-type obstacle.
The first obstacle feature information in the first obstacle feature information set may include, but is not limited to, at least one of: first azimuth angle information, a first lateral distance value, and a first longitudinal distance value. The first azimuth information may include an azimuth identifier and an angle value. The orientation identifier may uniquely identify the orientation of the obstacle vehicle relative to the current vehicle. For example, the above-described orientation indicator may be "front left". The first lateral distance value may be a lateral distance value between the obstacle vehicle and the current vehicle. The first longitudinal distance value may be a longitudinal distance value between the obstacle vehicle and the current vehicle. The above-described second obstacle feature information set may be information of the obstacle vehicle output by the millimeter wave radar. The second obstacle feature information in the second obstacle feature information set may include, but is not limited to, at least one of: second azimuth information, a second lateral distance value, and a second longitudinal distance value. The second azimuth information may be information of an azimuth and an angle of the obstacle vehicle deviating from the current vehicle traveling direction. The second azimuth information may include an azimuth identifier and an angle value. The second lateral distance value may be a distance value of the obstacle vehicle from the current vehicle in the lateral direction. The second longitudinal distance value may be a longitudinal distance value between the obstacle vehicle and the current vehicle. The lane path information in the lane path information sequence may be information of a lane and a path on the lane where the current vehicle travels from the start point to the end point, which is planned and output by the vehicle-mounted navigation apparatus. 
The lane path information in the lane path information sequence may include a lane identification, a road identification, an adjacent lane information group, lane start coordinates, a lane path trajectory curve and a path length value. The lane identification may be a unique identification of the lane. The road identification may be a unique identification of the road where the lane is located. The adjacent lane information in the adjacent lane information group may include a left adjacent lane identification and a right adjacent lane identification. The left adjacent lane identification may identify the adjacent lane to the left of the corresponding lane. The right adjacent lane identification may identify the adjacent lane to the right of the corresponding lane. The lanes of the same road may be lanes arranged in parallel in the same direction. The lane start coordinates may be the coordinates of the starting point on the center line of the lane. The lane path trajectory curve may be used to characterize a path on the corresponding lane. The path length value may be the length of the path on the corresponding lane.
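For concreteness, the information items above can be sketched as simple data structures. This is an illustrative assumption of how such records might look, not a format prescribed by the disclosure; all class and field names are invented for the sketch.

```python
from dataclasses import dataclass


@dataclass
class ObstacleFeature:
    """One obstacle-vehicle detection (camera or millimeter-wave radar)."""
    orientation_id: str    # orientation relative to the ego vehicle, e.g. "front-left"
    angle_deg: float       # angle off the current driving direction
    lateral_m: float       # lateral distance to the ego vehicle
    longitudinal_m: float  # longitudinal distance to the ego vehicle


@dataclass
class LaneLineInfo:
    """One lane line as seen by the smart camera."""
    curve_coeffs: list     # coefficients of the lane-line curve
    distance_m: float      # distance from the front-axle center to the line


@dataclass
class CurrentVehicleInfo:
    """Ego-vehicle state from inertial navigation and GPS."""
    speed_mps: float
    gps_lat: float
    gps_lon: float
```

A first-obstacle-feature record and a second-obstacle-feature record would then share the `ObstacleFeature` shape, differing only in which sensor produced them.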
Step 102, performing fusion processing on the first obstacle feature information set and the second obstacle feature information set based on the lane line perception information to obtain a current lane obstacle feature information group and a left lane obstacle feature information group.
In some embodiments, the executing entity may perform fusion processing on the first obstacle feature information set and the second obstacle feature information set in various ways based on the lane line perception information to obtain a current lane obstacle feature information set and a left lane obstacle feature information set. The current lane obstacle feature information in the current lane obstacle feature information group can be used for representing obstacle vehicles on the current lane. The current lane may be the lane in which the current vehicle is located. The left lane obstacle feature information in the left lane obstacle feature information group can be used for representing obstacle vehicles on the left lane. The left lane may be a left adjacent lane of a lane where the vehicle is currently located.
In some optional implementation manners of some embodiments, the executing body may perform fusion processing on the first obstacle feature information set and the second obstacle feature information set based on the lane line perception information to obtain a current lane obstacle feature information set and a left lane obstacle feature information set by:
the first step is that each first obstacle feature information in the first obstacle feature information set is subjected to redundancy removal processing, and a first obstacle feature information set after redundancy removal is obtained. The redundancy-removed first obstacle feature information set may be the redundancy-removed first obstacle feature information set. Firstly, clustering processing is carried out on each first obstacle feature information of the first obstacle feature information set to obtain a first cluster obstacle feature information group set. Each first cluster obstacle feature information set in the first cluster obstacle feature information set can be used for representing the same obstacle vehicle. The first cluster obstacle feature information in the first cluster obstacle feature information group may be information corresponding to the same obstacle vehicle. The first obstacle feature information set can be clustered through a preset clustering algorithm according to a first transverse distance value and a first longitudinal distance value included in each first obstacle feature information, so that a first clustered obstacle feature information group set is obtained. Then, for each first cluster obstacle feature information group in the first cluster obstacle feature information group, determining any first cluster obstacle feature information in the first cluster obstacle feature information group as the redundancy-removed first obstacle feature information.
As an example, the preset clustering algorithm may include, but is not limited to, at least one of the following: a density-based clustering algorithm and a K-means algorithm.
In a second step, redundancy removal processing is performed on the second obstacle feature information set to obtain a redundancy-removed second obstacle feature information set. First, the pieces of second obstacle feature information in the set are clustered to obtain a set of second clustered obstacle feature information groups, where each second clustered obstacle feature information group may be used to characterize the same obstacle vehicle and the pieces of second clustered obstacle feature information in a group all correspond to that obstacle vehicle. The second obstacle feature information set may be clustered by the preset clustering algorithm according to the second lateral distance value and the second longitudinal distance value included in each piece of second obstacle feature information. Then, for each second clustered obstacle feature information group, any piece of second clustered obstacle feature information in the group is determined as a piece of redundancy-removed second obstacle feature information.
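The redundancy removal in the first two steps can be sketched as follows. The disclosure leaves the "preset clustering algorithm" open (it names density-based clustering and K-means as examples); this sketch substitutes a simple greedy grouping on the lateral and longitudinal distance values, and the 0.5 m merge radius is an assumed parameter, not one from the text.

```python
import math


def deduplicate(features, radius=0.5):
    """Group detections of the same obstacle vehicle and keep one per group.

    `features` is a list of (lateral_m, longitudinal_m) tuples; `radius` is
    an assumed merge distance in meters. A greedy grouping stands in for the
    unspecified preset clustering algorithm (e.g. DBSCAN or K-means).
    """
    clusters = []  # each cluster is a list of points for one obstacle vehicle
    for p in features:
        for cluster in clusters:
            # Join the first cluster containing a nearby point.
            if any(math.dist(p, q) <= radius for q in cluster):
                cluster.append(p)
                break
        else:
            clusters.append([p])
    # Any member may represent its cluster; take the first.
    return [cluster[0] for cluster in clusters]
```

Applied to, say, `[(0.0, 10.0), (0.1, 10.1), (3.0, 20.0)]`, the first two detections merge into one obstacle vehicle and the third stays separate.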
In a third step, for each piece of redundancy-removed first obstacle feature information in the redundancy-removed first obstacle feature information set, the redundancy-removed second obstacle feature information in the redundancy-removed second obstacle feature information set that matches it is determined as obstacle feature information to be deleted. Here, matching the redundancy-removed first obstacle feature information may mean that the error between the second lateral distance value included in the redundancy-removed second obstacle feature information and the first lateral distance value included in the redundancy-removed first obstacle feature information satisfies a preset error threshold, and the error between the second longitudinal distance value and the first longitudinal distance value likewise satisfies the preset error threshold. For example, the preset error threshold may be 0.1 meter. The obstacle feature information to be deleted is thus redundancy-removed second obstacle feature information that is to be deleted.
And fourthly, deleting the redundancy-removed second obstacle characteristic information matched with each obtained obstacle characteristic information to be deleted in the redundancy-removed second obstacle characteristic information set to obtain a deleted second obstacle characteristic information set. The matching with each obtained obstacle feature information to be deleted may be that the redundancy-removed second obstacle feature information is the same as any obstacle feature information to be deleted. The post-deletion second obstacle feature information in the post-deletion second obstacle feature information set may be post-redundancy removal second obstacle feature information different from each post-redundancy removal first obstacle feature information. First, for each obtained obstacle feature information to be deleted, the redundancy-removed second obstacle feature information that is the same as the obstacle feature information to be deleted may be deleted from the redundancy-removed second obstacle feature information set. Then, the deleted redundancy-removed second obstacle feature information set is determined as a deleted second obstacle feature information set.
And fifthly, generating a target obstacle feature information set based on the redundancy-removed first obstacle feature information set and the deleted second obstacle feature information set. The target obstacle feature information in the target obstacle feature information set can be used for representing different obstacle vehicles. Each of the redundancy-removed first obstacle feature information in the redundancy-removed first obstacle feature information set and each of the deleted second obstacle feature information in the deleted second obstacle feature information set may be determined as target obstacle feature information, so as to obtain a target obstacle feature information set.
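The matching, deletion and union steps above can be sketched as follows (a minimal illustration; the dictionary layout, the field names `lateral`/`longitudinal`, and the helper name `fuse_obstacle_sets` are assumptions, while the 0.1 metre threshold is the example value from the text):

```python
ERROR_THRESHOLD = 0.1  # metres, the example preset error threshold

def fuse_obstacle_sets(first_set, second_set, threshold=ERROR_THRESHOLD):
    """Drop entries of second_set whose lateral and longitudinal distances
    both lie within `threshold` of some entry in first_set, then return
    the union of both sets as the target obstacle feature set."""
    to_delete = []
    for first in first_set:
        for second in second_set:
            if (abs(second["lateral"] - first["lateral"]) <= threshold and
                    abs(second["longitudinal"] - first["longitudinal"]) <= threshold):
                to_delete.append(second)  # matched: same physical obstacle
    deleted_second_set = [s for s in second_set if s not in to_delete]
    return first_set + deleted_second_set
```

An entry of the second set is dropped only when both distance errors fall within the threshold, so obstacles observed by only one sensor survive the fusion.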
And sixthly, classifying the target obstacle feature information set based on the lane line perception information to obtain a current lane obstacle feature information group and a left lane obstacle feature information group. The current lane obstacle feature information group and the left lane obstacle feature information group can be obtained through a preset classification method. The preset classification method may be a method of projecting the lane lines and the obstacle vehicles to a vehicle body coordinate system according to the lane lines and the characteristic information of the obstacle vehicles, and then distinguishing the obstacle vehicles corresponding to different lanes according to the coordinate position relationship. The preset classification method may be specifically performed as follows:
The first substep is to project the left lane line curve and the right lane line curve included in the lane line sensing information to the vehicle body coordinate system based on a preset camera transformation matrix, so as to obtain a left projected lane line curve and a right projected lane line curve. The preset camera transformation matrix may be predefined and can be used to represent the coordinate transformation between the camera coordinate system and the vehicle body coordinate system. The left projected lane line curve may be used to represent the left lane line of the lane in which the current vehicle is located. The right projected lane line curve may be used to represent the right lane line of the lane in which the current vehicle is located.
And a second substep of generating an obstacle coordinate set based on the target obstacle feature information set. The obstacle coordinates in the obstacle coordinate set may be coordinates of the obstacle vehicle in the vehicle body coordinate system. The obstacle coordinates in the obstacle coordinate set may include a horizontal axis coordinate value and a vertical axis coordinate value. For each target obstacle feature information in the target obstacle feature information set, the corresponding obstacle coordinates of the obstacle vehicle in the vehicle body coordinate system can be determined according to the azimuth angle, the transverse distance value and the longitudinal distance value between the obstacle vehicle and the current vehicle in the target obstacle feature information.
A third substep of, for each obstacle coordinate in the set of obstacle coordinates, performing the steps of:
step one, determining a longitudinal axis coordinate value included in the obstacle coordinate as a target longitudinal axis coordinate value.
And step two, determining the horizontal axis coordinate value of the point meeting the first preset coordinate value condition on the left side projected lane line curve as a first left side horizontal axis coordinate value. The first preset coordinate value condition may be that a coordinate value of a vertical axis of a point on the left projected lane line curve is a target coordinate value of a vertical axis.
And step three, determining the horizontal axis coordinate value of the point meeting the second preset coordinate value condition on the right side projected lane line curve as the first right side horizontal axis coordinate value. The second preset coordinate value condition may be that the coordinate value of the vertical axis of the point on the right projected lane line curve is a target coordinate value of the vertical axis.
And step four, in response to the fact that the coordinate value of the horizontal axis included in the obstacle coordinates meets the interval condition of the preset coordinate value, determining the target obstacle feature information corresponding to the obstacle coordinates as the obstacle feature information of the current lane. The preset coordinate value interval condition may be that the obstacle coordinate includes a horizontal axis coordinate value between the first left horizontal axis coordinate value and the first right horizontal axis coordinate value.
And step five, in response to the fact that the horizontal axis coordinate value included in the obstacle coordinates is smaller than the first left horizontal axis coordinate value, determining target obstacle feature information corresponding to the obstacle coordinates as left lane obstacle feature information.
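The classification substeps above can be sketched as follows (a hedged illustration; representing each projected lane line as polynomial coefficients giving the horizontal coordinate as a function of the vertical coordinate is an assumption, as are the helper names):

```python
def polyval(coeffs, t):
    """Evaluate a polynomial (highest-order coefficient first) at t."""
    value = 0.0
    for c in coeffs:
        value = value * t + c
    return value

def classify_obstacles(obstacles, left_line, right_line):
    """obstacles: (x, y, feature_info) tuples in the vehicle body frame;
    left_line / right_line: coefficients of the projected lane line curves.
    Returns (current lane group, left lane group) of feature infos."""
    current_lane, left_lane = [], []
    for x, y, info in obstacles:
        x_left = polyval(left_line, y)    # first left horizontal axis coordinate
        x_right = polyval(right_line, y)  # first right horizontal axis coordinate
        if x_left <= x <= x_right:
            current_lane.append(info)     # between the two lane lines
        elif x < x_left:
            left_lane.append(info)        # left of the left lane line
    return current_lane, left_lane
```

An obstacle whose horizontal coordinate lies between the two curves evaluated at its own vertical coordinate belongs to the current lane; one left of the left curve belongs to the left lane.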
Step 103, generating lane change expectation information based on the current vehicle perception information and the current lane obstacle feature information group.
In some embodiments, the execution subject may generate lane change expectation information based on the current vehicle sensing information and the current lane obstacle feature information set in various ways. The lane change expectation information may be information on whether the speed of the current vehicle needs to reach a preset speed value through lane change.
In some optional implementations of some embodiments, the current vehicle perception information may include a vehicle speed value. Each current lane obstacle feature information in the current lane obstacle feature information group may include a longitudinal distance value, which may be a first longitudinal distance value or a second longitudinal distance value. The executing body can generate the lane change expectation information through the following steps:
The first step is selecting, from the current lane obstacle feature information group, the current lane obstacle feature information meeting a preset front direction condition as front obstacle vehicle feature information, so as to obtain a front obstacle vehicle feature information group. The preset front direction condition may be that the azimuth identifier included in the current lane obstacle feature information is the same as any azimuth identifier in a preset azimuth identifier group. The front obstacle vehicle feature information in the front obstacle vehicle feature information group is used for representing an obstacle vehicle travelling ahead of the current vehicle in the current lane.
And secondly, selecting the front obstacle vehicle characteristic information meeting the preset longitudinal distance condition from the front obstacle vehicle characteristic information group as target obstacle vehicle characteristic information. The preset longitudinal distance condition may be that the longitudinal distance value included in the front obstacle vehicle characteristic information is the minimum value among the longitudinal distance values included in the front obstacle vehicle characteristic information group. The target obstacle vehicle characteristic information can be used for representing the obstacle vehicle which is positioned in front of the current vehicle on the current lane and has the smallest distance value with the current vehicle.
And thirdly, acquiring a speed threshold corresponding to the longitudinal distance value included in the target obstacle vehicle feature information. The speed threshold may be a maximum boundary value of the vehicle speed. The speed threshold can be obtained by reading a cached relation table between speed thresholds and longitudinal distance values. The relation table may include speed distance relation records, each of which can be used to characterize a one-to-one correspondence between a speed threshold and a longitudinal distance value.
And fourthly, in response to the fact that the vehicle speed value included in the current vehicle perception information is smaller than or equal to the speed threshold value, determining whether a preset expected speed value is larger than the speed threshold value. The preset desired speed value may be a preset vehicle speed value that the driver desires to reach. First, a difference between a preset desired speed value and the speed threshold is determined as a speed difference. The speed difference may be a difference between a maximum speed value that can be achieved when there is no collision risk between the current vehicle and the obstacle vehicle and a vehicle speed value that the driver expects to achieve. Then, if the speed difference is greater than 0, the preset expected speed value is greater than the speed threshold. Finally, if the speed difference is smaller than or equal to 0, the preset expected speed value is smaller than or equal to the speed threshold value.
And fifthly, in response to the fact that the preset expected speed value is larger than the speed threshold value, determining the preset expected speed value and a preset left lane changing mark as lane changing expected information. The preset left lane changing identifier can be used for representing that the lane is changed to the left. For example, the preset left lane change flag may be represented by 1.
Optionally, the executing body may further determine, in response to determining that the preset expected speed value is less than or equal to the speed threshold, the speed threshold and a preset lane change-free identifier as lane change expected information. The preset lane changing-free mark can be used for representing that the current vehicle does not change the lane and continues to run in the current lane. For example, the preset lane change-free flag may be represented by 0.
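The five steps above can be sketched as follows (a minimal sketch; the callable standing in for the cached speed threshold relation table is an assumption, and the identifier values 1 and 0 follow the examples in the text):

```python
LEFT_LANE_CHANGE = 1   # example preset left lane change identifier
NO_LANE_CHANGE = 0     # example preset no-lane-change identifier

def generate_lane_change_expectation(vehicle_speed, front_distances,
                                     speed_for_distance, desired_speed):
    """front_distances: longitudinal distances of obstacles ahead in the
    current lane; speed_for_distance: callable mapping a longitudinal
    distance to its speed threshold (stands in for the cached table)."""
    target_distance = min(front_distances)            # nearest front obstacle
    speed_threshold = speed_for_distance(target_distance)
    if vehicle_speed <= speed_threshold:
        if desired_speed - speed_threshold > 0:       # speed difference > 0
            return desired_speed, LEFT_LANE_CHANGE
        return speed_threshold, NO_LANE_CHANGE
    return None  # branch not specified in the text; left open
```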
Step 104, in response to determining that the lane change expectation information meets a preset expected lane change condition, generating merge information based on the current vehicle perception information and the lane path information sequence.
In some embodiments, the executing body may generate the merge information based on the current vehicle perception information and the lane path information sequence in response to determining that the lane change expectation information satisfies a preset expected lane change condition. The preset expected lane change condition may be that the lane change expectation information includes the preset left lane change identifier. The merge information may be the information required for the vehicle to merge lanes, and may be generated by:
firstly, obtaining a map lane mark corresponding to the current vehicle coordinate included in the current vehicle perception information. The map lane mark can be a mark of a lane where a current vehicle is located on a high-precision map. The map lane identification corresponding to the current vehicle coordinate can be obtained through a high-precision map interface.
And secondly, determining a serial number corresponding to the lane path information matched with the map lane mark in the lane path information sequence as a current lane serial number, and determining the lane path information matched with the map lane mark in the lane path information sequence as current lane path information. The matching with the map lane mark may be that the lane mark included in the lane path information is the same as the map lane mark. The current lane path information may be used to characterize the lane in which the current vehicle is located. The current lane number may be a number of the current lane in the lane path.
And thirdly, determining the sum of the current lane serial number and 1 as a target lane serial number.
Fourthly, based on the sequence number of the target lane, executing the following coordinate generation step of the starting end of the merging lane:
the first substep determines lane route information having a sequence number of the target lane sequence number among the lane route information sequence as target lane route information.
And a second substep of, in response to determining that the adjacent lane information group included in the target lane path information meets a preset adjacent lane condition, determining the start coordinate of the lane corresponding to the target lane path information as the merging lane start coordinate. The preset adjacent lane condition may be that no left adjacent lane marker exists in the adjacent lane information group included in the target lane path information.
And fifthly, acquiring a target path length value based on the current vehicle coordinate and the merging lane start coordinate. The target path length value may be the length of the path from the current vehicle coordinate to the merging lane start coordinate, and can be acquired through vehicle navigation.
And sixthly, determining the coordinates of the starting end of the merging lane, the sequence number of the target lane, the path information of the target lane and the length value of the target path as merging information.
Optionally, the executing body may further perform the following steps:
A first step of determining the difference between the target lane number and 1 as a predecessor lane number in response to determining that the adjacent lane information group included in the target lane path information does not satisfy the preset adjacent lane condition. The predecessor lane number may correspond to the lane path information arranged immediately before the target lane path information in the lane path information sequence.
And secondly, determining whether the road identifier corresponding to the target lane path information is the same as the road identifier corresponding to the lane path information whose sequence number is the predecessor lane number in the lane path information sequence.
And thirdly, in response to determining that the two road identifiers are different, determining the start coordinate of the lane corresponding to the target lane path information as the merging lane start coordinate.
Optionally, the executing body may further, in response to determining that the two road identifiers are the same, determine the sum of the sequence number of the target lane path information in the lane path information sequence and 1 as the target lane number, and execute the merging lane start coordinate generating step again.
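The iterative search for the merging lane start coordinate can be sketched as follows (the field names `start`, `road_id` and `has_left_adjacent` are hypothetical; the text's two termination conditions, no left adjacent lane and a different road identifier, are kept):

```python
def find_merge_lane_start(lane_paths, current_index):
    """Walk forward from the lane after the current one until either the
    lane has no left-adjacent lane or it lies on a different road than its
    predecessor; return that lane's start coordinate."""
    target = current_index + 1
    while True:
        lane = lane_paths[target]
        if not lane["has_left_adjacent"]:
            return lane["start"]          # no lane further left: merge here
        predecessor = lane_paths[target - 1]
        if lane["road_id"] != predecessor["road_id"]:
            return lane["start"]          # new road segment: merge here
        target += 1                       # same road, keep searching left
```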
Step 105, in response to determining that the merge information meets a preset distance condition, performing detection processing on the left lane obstacle feature information group to obtain lane change decision information.
In some embodiments, the executing body may, in response to determining that the merge information satisfies a preset distance condition, perform detection processing on the left lane obstacle feature information group in various ways to obtain lane change decision information. The preset distance condition may be that the target path length value included in the merge information is greater than or equal to a preset distance value. For example, the preset distance value may be 1.5 km.
Optionally, each left lane obstacle feature information in the left lane obstacle feature information group may further include a longitudinal relative velocity value and a lateral relative velocity value. The longitudinal relative speed value may be a longitudinal relative speed value from the current vehicle to the obstacle vehicle. The above-mentioned lateral relative speed value may be a current vehicle-to-obstacle vehicle lateral relative speed value. The lane change decision information can be used for representing whether the left lane has collision risk or not. The lane change decision information can be generated by detecting and processing the obstacle feature information group of the left lane through the following steps:
first, a left-side obstacle integrated distance value group is generated based on the left-side lane obstacle feature information group. The left-side obstacle integrated distance value in the left-side obstacle integrated distance value group may be a distance value between a current vehicle front axle center and an obstacle vehicle front axle center. For each left lane obstacle feature information in the left lane obstacle feature information group, a left obstacle comprehensive distance value may be generated by using a first lateral distance value and a first longitudinal distance value corresponding to the left lane obstacle feature information, or a second lateral distance value and a second longitudinal distance value corresponding to the left lane obstacle feature information, according to a triangular distance formula.
And secondly, selecting a left-side obstacle comprehensive distance value meeting a preset minimum distance value condition from the left-side obstacle comprehensive distance value group, and determining the left-side lane obstacle feature information corresponding to the selected left-side obstacle comprehensive distance value as left-side target obstacle feature information. The preset minimum distance value condition may be that the left-side obstacle integrated distance value is a minimum value in the left-side obstacle integrated distance value set. The left target obstacle feature information may be used to characterize an obstacle vehicle on the left lane with the smallest distance value from the current vehicle.
And thirdly, generating a left collision duration value based on the left target obstacle characteristic information. The left-side collision duration value may be a duration value required for a collision between the current vehicle and the obstacle vehicle. First, a quotient of a first longitudinal distance value or a second longitudinal distance value included in the left target obstacle feature information and a longitudinal relative velocity value is determined as a longitudinal risk duration value. The longitudinal risk duration value may be a duration value required for a collision between the current vehicle and the obstacle vehicle in the longitudinal direction. And then, determining the quotient of the first transverse distance value or the second transverse distance value included in the left target obstacle characteristic information and the transverse relative speed value as a transverse risk duration value. The lateral risk duration value may be a duration value required for a lateral collision between the current vehicle and the obstacle vehicle. And finally, determining the minimum value of the longitudinal risk duration value and the transverse risk duration value as a left side collision duration value.
And fourthly, generating lane change decision information based on the left side collision duration value. First, in response to determining that the left side collision duration value is greater than a preset duration threshold, the preset lane-changeable identifier is determined as the lane change decision information. For example, the preset duration threshold may be 2 seconds. The preset lane-changeable identifier can be used for representing that the current vehicle can change to the left lane. Then, in response to determining that the left side collision duration value is less than or equal to the preset duration threshold, a preset non-lane-changeable identifier is determined as the lane change decision information. The preset non-lane-changeable identifier can be used for representing that the current vehicle cannot change to the left lane.
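The detection steps above amount to a nearest-obstacle time-to-collision check, which can be sketched as follows (the 2 second threshold is the example from the text; the tuple layout and identifier values are assumptions):

```python
import math

CAN_CHANGE = 1        # example preset lane-changeable identifier
CANNOT_CHANGE = 0     # example preset non-lane-changeable identifier
TIME_THRESHOLD = 2.0  # seconds, the example preset duration threshold

def lane_change_decision(left_obstacles, threshold=TIME_THRESHOLD):
    """left_obstacles: (lateral, longitudinal, v_lat, v_lon) tuples for
    obstacles in the left lane. Pick the nearest obstacle by straight-line
    distance, then compare its time-to-collision with the threshold."""
    nearest = min(left_obstacles, key=lambda o: math.hypot(o[0], o[1]))
    lat, lon, v_lat, v_lon = nearest
    longitudinal_ttc = lon / v_lon   # longitudinal risk duration value
    lateral_ttc = lat / v_lat        # lateral risk duration value
    collision_time = min(longitudinal_ttc, lateral_ttc)
    return CAN_CHANGE if collision_time > threshold else CANNOT_CHANGE
```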
Step 106, in response to determining that the lane change decision information meets a preset lane change condition, generating lane change trajectory information based on a preset high-precision map, the current vehicle perception information, the left lane obstacle feature information group and the lane line perception information.
In some embodiments, the executing body may generate the lane change trajectory information based on a preset high-precision map, the current vehicle perception information, the left lane obstacle feature information group, and the lane line perception information in response to determining that the lane change decision information satisfies a preset lane change condition. The preset lane change condition may be that the lane change decision information includes the preset lane-changeable identifier. The lane change trajectory information may be used to characterize the path on which the current vehicle travels from the current position to the centre line of the left lane. The lane change trajectory information may include, but is not limited to, at least one of the following: a lane change trajectory curve, start point coordinates, end point coordinates and lane dividing point coordinates. The lane change trajectory curve may be used to characterize the path of the vehicle travelling from the current lane to the adjacent lane. The start point coordinates may be the current vehicle coordinates. The end point coordinates may be the coordinates at the end of the current vehicle's lane change in the high-precision map coordinate system. The lane dividing point coordinates may be the coordinates, in the high-precision map coordinate system, of the point on the lane line that divides the lane change trajectory curve into two segments. The lane change trajectory information may be generated through a preset lane change trajectory planning algorithm based on the preset high-precision map, the current vehicle perception information, the left lane obstacle feature information group, and the lane line perception information.
As an example, the preset lane change trajectory planning algorithm may include, but is not limited to, at least one of the following: a sine-function lane change trajectory and a polynomial-function lane change trajectory.
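As a hedged illustration of a polynomial-function lane change trajectory, the sketch below uses one common quintic profile with zero slope and curvature at both ends; it is not necessarily the profile used by the disclosure, and all names are assumptions:

```python
def quintic_lane_change(lane_width, length, n=50):
    """Sample a quintic lateral offset profile: y moves smoothly from 0 to
    lane_width while the vehicle covers longitudinal span `length`."""
    xs, ys = [], []
    for i in range(n):
        s = i / (n - 1)                        # normalized progress in [0, 1]
        xs.append(s * length)
        # 10s^3 - 15s^4 + 6s^5 has zero 1st and 2nd derivatives at s = 0, 1
        ys.append(lane_width * (10 * s**3 - 15 * s**4 + 6 * s**5))
    return xs, ys
```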
Step 107, generating vehicle path information based on the lane change trajectory information, the merge information and the lane path information sequence, and sending the vehicle path information to a vehicle control module for controlling the vehicle to travel, wherein generating the vehicle path information comprises:
step 1071, generating combined track information based on the track change track information and the combined track information.
In some embodiments, the execution body may generate the merge track information based on the swap track information and the merge track information. The merging trajectory information may be information of a path trajectory from an end point coordinate included in the lane change trajectory information to a merging lane start point coordinate included in the merging trajectory information. The merged track information may be generated by:
first, driving area information is generated based on the target lane map information, the current lane path information, the lane change trajectory information, and the merge information. The driving area information may be information of an expected driving area of the vehicle merging lane. First, a straight line perpendicular to the vehicle traveling direction and passing through the coordinates of the end point included in the lane change trajectory information is determined as a first straight line, and a straight line perpendicular to the vehicle traveling direction and passing through the coordinates of the start end of the merging lane included in the merging lane information is determined as a second straight line. Secondly, determining the coordinates of the intersection point between the first straight line and the left lane line curve of the lane where the end point coordinates are located as first intersection point coordinates, and determining the coordinates of the intersection point between the first straight line and the right lane line curve of the lane where the end point coordinates are located as second intersection point coordinates. And then, determining the coordinate of an intersection point between the second straight line and the left lane line curve of the lane where the start end coordinate of the lane merging lane is located as a third intersection point coordinate, and determining the coordinate of an intersection point between the first straight line and the right lane line curve of the lane where the start end coordinate of the lane merging lane is located as a fourth intersection point coordinate. Thereafter, a straight line passing through the first intersection coordinate and the third intersection coordinate is determined as a third straight line, and a straight line passing through the second intersection coordinate and the fourth intersection coordinate is determined as a fourth straight line. 
And finally, determining information of a closed area enclosed by the first straight line, the second straight line, the third straight line and the fourth straight line as driving area information.
And a second step of generating a boundary information group and a triangular region information set based on the driving area information. The boundary information in the boundary information group may be used to characterize a line segment on a common boundary between two regions. The triangular region information in the triangular region information set may be information of a triangular region in which the vehicle is expected to travel on the road. First, the midpoint of the second intersection point coordinate and the third intersection point coordinate is determined as the target midpoint coordinate. Then, the first intersection point coordinate, the second intersection point coordinate, the third intersection point coordinate and the fourth intersection point coordinate are respectively connected with the target midpoint coordinate to obtain the boundary information group. Finally, the closed region corresponding to the driving area information is divided into triangular regions according to the boundary information in the boundary information group, and the information corresponding to each triangular region is determined as triangular region information, so as to obtain the triangular region information set.
And thirdly, generating a connected region information group based on the boundary information group and the triangular region information set through a preset connected region identification algorithm. The connected region information in the connected region information group comprises a connected boundary line segment information group and a target triangular region information sequence. The connected boundary line segment information in the connected boundary line segment information group may be information of a line segment on a common boundary between the two areas. The connected boundary line segment information in the connected boundary line segment information group may include a first endpoint coordinate and a second endpoint coordinate. The first endpoint coordinates may be coordinates of an endpoint of the line segment. The coordinates of the second end point may be coordinates of an end point at the other end of the line segment. The target triangular region information sequence can be used for representing a connected region covering the coordinates of the end point included in the track changing track information and the coordinates of the start end of the merging lane included in the merging information. The preset connected region identification algorithm may be a preset algorithm for performing connectivity identification on each triangular region corresponding to the triangular region information set according to the boundary information group.
And fourthly, generating a to-be-screened merging trajectory information group based on the connected region information group. Each to-be-screened merging trajectory information in the group may be information of one candidate path trajectory from the end point coordinate included in the lane change trajectory information to the merging lane start coordinate included in the merge information. The to-be-screened merging trajectory information may include a trajectory path length value, which may be the length of the path corresponding to the curved trajectory. For each connected region information in the connected region information group, the following steps may be performed:
and a first substep of generating a merging key point coordinate set based on a connected boundary line segment information set included in the connected region information. The merging key point coordinate in the merging key point coordinate set may be a midpoint coordinate of the boundary line segment. For each connected boundary line segment information in the connected boundary line segment information group included in the connected region information, a coordinate of a midpoint between a first endpoint coordinate and a second endpoint coordinate included in the connected boundary line segment information may be determined as a merging key point coordinate.
And a second substep of generating the to-be-screened merging trajectory information based on the merging key point coordinate set, the end point coordinate included in the lane change trajectory information and the merging lane start coordinate included in the merge information. The to-be-screened merging trajectory information can be generated through the preset lane change trajectory planning algorithm.
And fifthly, taking the trajectory path length value of each to-be-screened merging trajectory information in the to-be-screened merging trajectory information group as a to-be-screened trajectory path length value group, and selecting from it the to-be-screened trajectory path length value meeting a preset screening condition as a target length value. Each to-be-screened trajectory path length value may be the length of a trajectory from the end point coordinate included in the lane change trajectory information to the merging lane start coordinate included in the merge information. The preset screening condition may be that the to-be-screened trajectory path length value is the minimum value in the to-be-screened trajectory path length value group, so the target length value is that minimum value.
And sixthly, determining the to-be-screened merging track information corresponding to the target length value as merging track information.
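Steps four through six amount to a minimum-length selection over the candidate tracks. A sketch, assuming each candidate is a dict with a hypothetical `track_path_length` key (the field name is an assumption, not from the source):

```python
def select_merging_track(candidates):
    """Return the to-be-screened merging track information with the minimum
    track path length value (the preset screening condition)."""
    return min(candidates, key=lambda c: c["track_path_length"])
```

The selected candidate is then used directly as the merging track information.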
And step 1072, intercepting the lane path information sequence based on the merging information to obtain the intercepted lane path information sequence.
In some embodiments, the execution body may intercept the lane path information sequence based on the merging information to obtain an intercepted lane path information sequence. The intercepted lane path information in the sequence may be information of the lanes, and the paths on those lanes, that the vehicle is to travel after completing the lane change and the merge. The merging information may include a target lane number; the lane path information sequence may be intercepted from the lane path information having the target lane number through the end of the sequence, and the intercepted lane path information, in sequence order, may be determined as the intercepted lane path information sequence.
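The interception described above can be sketched as a sub-sequence slice, assuming each entry carries a hypothetical `lane_number` field (the field name is an assumption):

```python
def intercept_lane_path_sequence(lane_path_infos, target_lane_number):
    """Keep the entries from the one whose lane number equals the target lane
    number (taken from the merging information) through the end of the sequence."""
    for i, info in enumerate(lane_path_infos):
        if info["lane_number"] == target_lane_number:
            return lane_path_infos[i:]
    return []
```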
Step 1073, determining the track changing track information, the combined track information and the intercepted lane path information sequence as vehicle path information.
In some embodiments, the execution body may determine the track changing track information, the merging track information, and the intercepted lane path information sequence as vehicle path information. The vehicle path information may be used to characterize a path from the current position of the current vehicle to the destination.
In practice, after sending the vehicle path information to the vehicle control module, the execution body may control the vehicle, through vehicle control commands, to travel along the path represented by the vehicle path information. The vehicle control module may be a set of programs for controlling vehicle driving through vehicle control commands. The vehicle control commands may include, but are not limited to, at least one of: an acceleration command, a deceleration command, and a steering command.
The vehicle path information generating step and its related content are regarded as an invention point of the embodiments of the present disclosure, and solve the technical problem mentioned in the background art that "more computing resources are occupied". The factor that causes more computing resources to be occupied is as follows: if the vehicle changes lanes midway to increase its speed, the vehicle path information needs to be regenerated from the vehicle position and the destination by relying on vehicle navigation, which occupies more computing resources. If this factor is addressed, the effect of reducing the occupation of computing resources can be achieved. To achieve this, the present disclosure first plans a lane change track for the vehicle. Then, merging track planning is carried out between the end point of the lane change track and the merging lane start end coordinates. During merging track planning, a drivable area between the lanes is first determined; the drivable area is then divided into a plurality of triangular regions by boundary line segments; next, connected regions composed of triangular regions, covering both the end point of the lane change track and the merging lane start end coordinates, are determined, and the to-be-screened path tracks are determined on the basis of these connected regions. Because multiple connected regions are determined, multiple path tracks can be obtained, and the path track with the minimum length value among them is determined as the merging track. Thereby, the shortest merging path can be obtained. Then, according to the merging information, the lane path information sequence is intercepted to obtain the shortest path from the completion of the merge to the destination.
And finally, the track changing track information, the merging track information, and the intercepted lane path information sequence are determined as the vehicle path information. Thus, the vehicle path can be planned in segments, so that the vehicle travels along the shortest path as far as possible while repeated planning of partial segments is avoided, thereby reducing the occupation of computing resources. Further, system operation efficiency can be improved.
Optionally, the track changing track information may include lane separation point coordinates. The execution body may further perform the following steps:
Firstly, projecting the lane separation point coordinates included in the track changing track information into a preset vehicle body coordinate system to obtain lane separation point projection coordinates. The preset vehicle body coordinate system may be the vehicle body coordinate system at the current moment. The lane separation point coordinates may be projected from the high-precision map coordinate system into the vehicle body coordinate system through a preset conversion matrix, which may be used to characterize the coordinate conversion relationship between the high-precision map coordinate system and the vehicle body coordinate system.
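The projection through the preset conversion matrix is a rigid transform between the map frame and the body frame. A minimal 2D sketch, assuming the vehicle pose (position plus yaw) in the map frame stands in for the conversion matrix:

```python
import math

def project_to_body_frame(point_map, vehicle_pose):
    """Map a map-frame point into the vehicle body frame.
    vehicle_pose = (x, y, yaw): origin and heading of the body frame in the map frame."""
    px, py = point_map
    vx, vy, yaw = vehicle_pose
    dx, dy = px - vx, py - vy          # offset from the vehicle origin
    c, s = math.cos(yaw), math.sin(yaw)
    # Rotate the offset by -yaw to express it along the body axes.
    return (c * dx + s * dy, -s * dx + c * dy)
```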
And secondly, acquiring the first lane line characteristic information, a third obstacle characteristic information set and vehicle positioning coordinates corresponding to the current vehicle. The first lane line feature information may be information of lane lines on both sides of a lane where the current vehicle is located, which is output by the smart camera. The third obstacle feature information set may be information of obstacle vehicles around the current vehicle, which is output by the smart camera. The third obstacle feature information in the third obstacle feature information set may include an obstacle lateral distance value and an obstacle longitudinal distance value. The vehicle positioning coordinates may be GPS positioning coordinates of the current vehicle output by the in-vehicle navigation apparatus.
And thirdly, classifying the third obstacle feature information set based on the first lane line feature information to obtain a first lane obstacle feature information set and a second lane obstacle feature information set. The first lane obstacle feature information in the first lane obstacle feature information set may be used to characterize an obstacle vehicle in the current lane where the current vehicle is located, and may include a first target lateral distance value, a first target longitudinal distance value, a first target lateral relative velocity value, and a first target longitudinal relative velocity value. The second lane obstacle feature information in the second lane obstacle feature information set may be used to characterize an obstacle vehicle in the left adjacent lane of the lane where the current vehicle is located, and may include a second target lateral distance value, a second target longitudinal distance value, a second target lateral relative velocity value, and a second target longitudinal relative velocity value. The third obstacle feature information set can be classified through a preset classification method to obtain the first lane obstacle feature information set and the second lane obstacle feature information set.
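As a stand-in for the preset classification method (which the text does not spell out), obstacles can be split by comparing each obstacle's lateral distance value against the lateral positions of the two lane lines; the field names and the left-positive sign convention are assumptions:

```python
def classify_obstacles(third_obstacle_infos, left_line_lat, right_line_lat):
    """Obstacles whose lateral distance lies between the right and left lane
    lines belong to the current lane; those beyond the left line belong to the
    left adjacent lane. Lateral distance is assumed to grow to the left."""
    first_lane, second_lane = [], []
    for info in third_obstacle_infos:
        lat = info["lateral_distance"]
        if right_line_lat <= lat <= left_line_lat:
            first_lane.append(info)
        elif lat > left_line_lat:
            second_lane.append(info)
    return first_lane, second_lane
```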
And fourthly, generating a target collision duration value based on the second lane obstacle feature information set in response to determining that the lane separation point projection coordinates satisfy a first preset projection coordinate condition. The first preset projection coordinate condition may be that the horizontal axis coordinate value of the lane separation point projection coordinates is less than or equal to 0 and the vertical axis coordinate value is greater than or equal to 0. The target collision duration value may be used to characterize how long until a collision occurs between the current vehicle and an obstacle vehicle. First, for each piece of second lane obstacle feature information in the second lane obstacle feature information set, a target obstacle comprehensive distance value may be generated based on that information according to the triangular distance formula. Each generated target obstacle comprehensive distance value may be the distance between the front axle center of the current vehicle and the front axle center of the obstacle vehicle. Secondly, the second lane obstacle feature information corresponding to the smallest of the target obstacle comprehensive distance values is determined as the second target obstacle feature information. Then, the quotient of the second target longitudinal distance value and the second target longitudinal relative velocity value included in the second target obstacle feature information is determined as the target longitudinal risk duration value, and the quotient of the second target lateral distance value and the second target lateral relative velocity value is determined as the target lateral risk duration value.
And finally, determining the smaller of the target longitudinal risk duration value and the target lateral risk duration value as the target collision duration value.
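The fourth step reduces to a time-to-collision estimate for the nearest obstacle. A sketch with assumed tuple fields, reading the "triangular distance formula" as the Euclidean distance:

```python
import math

def target_collision_duration(second_lane_obstacles):
    """Each obstacle: (lateral_dist, longitudinal_dist, lateral_rel_v, longitudinal_rel_v).
    Pick the obstacle with the smallest comprehensive (Euclidean) distance, then
    take the smaller of the longitudinal and lateral distance/velocity quotients."""
    nearest = min(second_lane_obstacles, key=lambda o: math.hypot(o[0], o[1]))
    lat_d, lon_d, lat_v, lon_v = nearest
    return min(lon_d / lon_v, lat_d / lat_v)
```

The result is then compared against the preset duration threshold in the fifth step.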
And fifthly, in response to the fact that the target collision time length value is smaller than the preset time length threshold value, generating first obstacle avoidance track planning information based on the vehicle positioning coordinate, the first lane line characteristic information and the first lane obstacle characteristic information set, and sending the first obstacle avoidance track planning information to the vehicle control module to control the vehicle to drive. The first obstacle avoidance trajectory planning information may be used to represent a path traveled from a current position back to a center line of a current lane after the current vehicle stops changing lanes. First obstacle avoidance track planning information can be generated through a preset path planning algorithm, the first obstacle avoidance track planning information is sent to the vehicle control module, and the vehicle is controlled to return to the center line of the current lane to continue running through a vehicle control command.
As an example, the preset path planning algorithm may include, but is not limited to, at least one of the following: the A* (A-Star) path planning algorithm and the Lattice Planner algorithm.
Optionally, the executing body may further perform the following steps:
and step one, responding to the condition that the projection coordinates of the lane separation points meet a second preset projection coordinate condition, and acquiring a target vehicle speed value. The second preset projection coordinate condition may be that a horizontal axis coordinate value corresponding to the projection coordinate of the lane separation point is greater than 0 and a vertical axis coordinate value is less than 0. The above-mentioned target vehicle speed value may be a speed value of the current vehicle. The target vehicle speed value can be obtained through the inertial navigation equipment.
And secondly, generating a remaining distance value based on the vehicle positioning coordinates and the end point coordinate included in the track changing track information. The remaining distance value may be the length value of the path that the current vehicle still needs to travel to complete the lane change. First, the length value of the path between the vehicle positioning coordinates and the end point coordinate included in the track changing track information may be acquired through an interface of the high-precision map. Then, the acquired length value is determined as the remaining distance value.
And thirdly, determining the quotient of the remaining distance value and the target vehicle speed value as a remaining lane change duration value. The remaining lane change duration value may be the duration still required for the current vehicle to complete the lane change.
And fourthly, generating a remaining collision duration value based on the first lane obstacle feature information set. The remaining collision duration value may be used to characterize how long until a collision occurs between the current vehicle and an obstacle vehicle. First, for each piece of first lane obstacle feature information in the first lane obstacle feature information set, a first obstacle comprehensive distance value may be generated based on that information according to the triangular distance formula. Each generated first obstacle comprehensive distance value may be the distance between the front axle center of the current vehicle and the front axle center of the obstacle vehicle. Secondly, the first lane obstacle feature information corresponding to the smallest of the first obstacle comprehensive distance values is determined as the first target obstacle feature information. Then, the quotient of the first target longitudinal distance value and the first target longitudinal relative velocity value included in the first target obstacle feature information is determined as the first target longitudinal risk duration value, and the quotient of the first target lateral distance value and the first target lateral relative velocity value is determined as the first target lateral risk duration value. Finally, the smaller of the first target longitudinal risk duration value and the first target lateral risk duration value is determined as the remaining collision duration value.
And fifthly, determining the difference between the remaining collision duration value and the remaining lane change duration value as a duration difference value. The duration difference value can be used to characterize whether the current vehicle is at risk of collision while continuing to complete the lane change.
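Steps two through five combine into a simple timing check; a sketch under the assumption that all quantities are already in consistent units (metres, metres per second, seconds):

```python
def duration_difference(remaining_distance, vehicle_speed, remaining_collision_duration):
    """Remaining lane-change duration = remaining distance / vehicle speed.
    The returned difference is compared against the preset threshold (e.g. 2 s)
    to decide whether obstacle avoidance re-planning is needed."""
    remaining_lane_change_duration = remaining_distance / vehicle_speed
    return remaining_collision_duration - remaining_lane_change_duration
```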
And sixthly, generating second obstacle avoidance track planning information based on the vehicle positioning coordinate, the first lane line characteristic information and the first lane obstacle characteristic information set in response to the fact that the time length difference value is smaller than or equal to a preset time length difference value threshold, and sending the second obstacle avoidance track planning information to the vehicle control module to control the vehicle to drive. The preset time difference threshold may be a preset threshold. For example, the preset time difference threshold may be 2 seconds. The second obstacle avoidance trajectory planning information may be used to represent a path re-planned for the vehicle to avoid the obstacle when the current vehicle has changed lanes to the left lane but the lane change process has not been completed. And generating second obstacle avoidance track planning information through the preset path planning algorithm, sending the second obstacle avoidance track planning information to the vehicle control module, and controlling the vehicle to safely drive to the central line of the current lane through a vehicle control instruction.
The steps of generating the first obstacle avoidance trajectory planning information and the second obstacle avoidance trajectory planning information, and their related content, are regarded as an invention point of the embodiments of the present disclosure, and solve the technical problem mentioned in the background art that "the safety of vehicle driving is reduced". The factors that reduce driving safety are often as follows: during a lane change, emergencies may occur, such as the vehicle ahead in the target lane decelerating sharply or the vehicle behind in the target lane suddenly accelerating; if the vehicle continues along the originally planned lane change path, it may collide with a surrounding vehicle, reducing driving safety. If these factors are addressed, the effect of improving driving safety can be achieved. To achieve this effect, the present disclosure first determines, from the lane separation point coordinates included in the lane change track information, whether the current vehicle has crossed the left lane line during the lane change. Then, if the current vehicle has not crossed the left lane line, a target collision duration value is generated to determine whether the left lane carries a collision risk. If the left lane carries a collision risk, first obstacle avoidance trajectory planning information is generated to control the vehicle to abandon the lane change and continue driving safely in the current lane. Finally, if the current vehicle has crossed the left lane line, the duration difference value between the remaining collision duration value and the remaining lane change duration value is generated to determine whether there is enough buffer time to avoid a collision after the lane change is completed.
If it is determined that there is not enough buffer time after the lane change, second obstacle avoidance trajectory planning information is generated to control the vehicle to continue driving safely in the left lane. Therefore, during the lane change, the target lane can be monitored in real time, so that when a collision risk exists, the current vehicle can plan an obstacle avoidance trajectory in time to reduce the risk. Thus, the safety of vehicle driving can be improved.
The above embodiments of the present disclosure have the following advantages: through the vehicle path information generation method of some embodiments of the present disclosure, the vehicle driving time can be shortened. Specifically, the reason the vehicle takes a long time to travel is as follows: although the vehicle can travel along the planned route to the destination, during long periods of car-following it has to be driven below the desired speed in order to maintain a safe distance. On this basis, the vehicle path information generation method of some embodiments of the present disclosure first acquires the current vehicle perception information, the lane line perception information, the first obstacle feature information set, the second obstacle feature information set, and the lane path information sequence. In this way, information about the current vehicle, the lane lines on the road where the current vehicle is located, and the obstacle vehicles can be obtained, from which lane change decision information can be generated to determine whether the current vehicle needs to change lanes to the left lane to accelerate. Secondly, based on the lane line perception information, the first obstacle feature information set and the second obstacle feature information set are fused to obtain the current lane obstacle feature information group and the left lane obstacle feature information group. Therefore, lane change expectation information can be generated from the information of the obstacle vehicles in the current lane, and lane change decision information can be generated from the information of the obstacle vehicles in the left lane. Then, lane change expectation information is generated based on the current vehicle perception information and the current lane obstacle feature information group.
Next, in response to determining that the lane change expectation information satisfies the preset expectation lane change condition, merging information is generated based on the current vehicle perception information and the lane path information sequence. In response to determining that the merging information satisfies the preset distance condition, the left lane obstacle feature information group is detected to obtain the lane change decision information. Therefore, whether the current vehicle needs to change lanes to the left lane to accelerate can be determined from the current vehicle's lane change expectation information. Considering that the current vehicle needs to keep driving toward the destination, the merging information is further generated on the basis of the lane change expectation information, and the lane change decision information is finally determined from the information of the obstacle vehicles in the left lane; whether a lane change track is then generated for the current vehicle is determined by the lane change decision information, so that the vehicle can change to the faster left lane, increase its speed, and shorten the time spent driving. Then, in response to determining that the lane change decision information satisfies the preset lane change condition, the lane change track information is generated based on the preset high-precision map, the current vehicle perception information, the left lane obstacle feature information group, and the lane line perception information. Therefore, it can be determined from the lane change decision information that the current vehicle is to change to the left lane, and on this basis the lane change track information is generated for the current vehicle to perform the lane change operation.
Finally, vehicle path information is generated based on the track changing track information, the merging information, and the lane path information sequence, and sent to the vehicle control module to control vehicle driving. The generating of the vehicle path information includes: generating merging track information based on the track changing track information and the merging information; intercepting the lane path information sequence based on the merging information to obtain an intercepted lane path information sequence; and determining the track changing track information, the merging track information, and the intercepted lane path information sequence as the vehicle path information. Thus, vehicle path information can be generated that lets the vehicle keep driving at a higher speed, shortening the driving time. In this way, the vehicle path information generation method can increase the driving speed by changing to the faster lane while the vehicle keeps a safe distance from the vehicle ahead. And because the vehicle after the lane change still needs to drive toward the destination, the merging information is considered both in determining the lane change decision information and in the later generation of the vehicle path information, so that the vehicle travels, as far as possible, along the shortest path planned at the departure point. Thus, the vehicle driving time can be shortened, and further the efficiency of vehicle driving can be improved.
With further reference to fig. 2, as an implementation of the methods illustrated in the above figures, the present disclosure provides some embodiments of a vehicle route information generation apparatus, which correspond to those method embodiments illustrated in fig. 1, and which may be particularly applicable in various electronic devices.
As shown in fig. 2, the vehicle path information generation apparatus 200 of some embodiments includes: an acquisition unit 201, a fusion processing unit 202, a first generation unit 203, a second generation unit 204, a detection processing unit 205, a third generation unit 206, and a generation and transmission unit 207. The acquisition unit 201 is configured to acquire current vehicle perception information, lane line perception information, a first obstacle feature information set, a second obstacle feature information set, and a lane path information sequence; the fusion processing unit 202 is configured to fuse the first obstacle feature information set and the second obstacle feature information set based on the lane line perception information to obtain a current lane obstacle feature information group and a left lane obstacle feature information group; the first generation unit 203 is configured to generate lane change expectation information based on the current vehicle perception information and the current lane obstacle feature information group; the second generation unit 204 is configured to generate merging information based on the current vehicle perception information and the lane path information sequence in response to determining that the lane change expectation information satisfies a preset expectation lane change condition; the detection processing unit 205 is configured to detect the left lane obstacle feature information group in response to determining that the merging information satisfies a preset distance condition, to obtain lane change decision information; the third generation unit 206 is configured to generate lane change track information based on a preset high-precision map, the current vehicle perception information, the left lane obstacle feature information group, and the lane line perception information in response to determining that the lane change decision information satisfies a preset lane change condition; the generation and transmission unit 207 is configured to generate vehicle path information based on the lane change track information, the merging information, and the lane path information sequence, and to send the vehicle path information to a vehicle control module for controlling vehicle driving, wherein the generating of the vehicle path information includes: generating merging track information based on the lane change track information and the merging information; intercepting the lane path information sequence based on the merging information to obtain an intercepted lane path information sequence; and determining the lane change track information, the merging track information, and the intercepted lane path information sequence as the vehicle path information.
It will be understood that the units described in the apparatus 200 correspond to the various steps in the method described with reference to fig. 1. Thus, the operations, features and advantages described above for the method are also applicable to the apparatus 200 and the units included therein, and are not described herein again.
With further reference to fig. 3, a schematic structural diagram of an electronic device 300 suitable for use in implementing some embodiments of the present disclosure is shown. The electronic device shown in fig. 3 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 3, the electronic device 300 may include a processing means (e.g., a central processing unit, a graphics processor, etc.) 301 that may perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 302 or a program loaded from a storage means 308 into a Random Access Memory (RAM) 303. In the RAM 303, various programs and data necessary for the operation of the electronic apparatus 300 are also stored. The processing device 301, the ROM 302, and the RAM 303 are connected to each other via a bus 304. An input/output (I/O) interface 305 is also connected to bus 304.
Generally, the following devices may be connected to the I/O interface 305: input devices 306 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 307 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage devices 308 including, for example, magnetic tape, hard disk, etc.; and a communication device 309. The communication means 309 may allow the electronic device 300 to communicate wirelessly or by wire with other devices to exchange data. While fig. 3 illustrates an electronic device 300 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided. Each block shown in fig. 3 may represent one device or may represent multiple devices, as desired.
In particular, according to some embodiments of the present disclosure, the processes described above with reference to the flow diagrams may be implemented as computer software programs. For example, some embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In some such embodiments, the computer program may be downloaded and installed from a network through the communication device 309, or installed from the storage device 308, or installed from the ROM 302. The computer program, when executed by the processing apparatus 301, performs the above-described functions defined in the methods of some embodiments of the present disclosure.
It should be noted that the computer readable medium described above in some embodiments of the present disclosure may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In some embodiments of the disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In some embodiments of the present disclosure, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. 
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, the client and the server may communicate using any currently known or future-developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with digital data communication in any form or medium (e.g., a communications network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future-developed network.
The computer readable medium may be embodied in the apparatus, or may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: acquire current vehicle perception information, lane line perception information, a first obstacle feature information set, a second obstacle feature information set and a lane path information sequence; perform fusion processing on the first obstacle feature information set and the second obstacle feature information set based on the lane line perception information to obtain a current lane obstacle feature information group and a left lane obstacle feature information group; generate lane change expectation information based on the current vehicle perception information and the current lane obstacle feature information group; in response to determining that the lane change expectation information meets a preset expectation lane change condition, generate lane merging information based on the current vehicle perception information and the lane path information sequence; in response to determining that the lane merging information meets a preset distance condition, perform detection processing on the left lane obstacle feature information group to obtain lane change decision information; in response to determining that the lane change decision information meets a preset lane change condition, generate lane change track information based on a preset high-precision map, the current vehicle perception information, the left lane obstacle feature information group and the lane line perception information; and generate vehicle path information based on the lane change track information, the lane merging information and the lane path information sequence, and send the vehicle path information to a vehicle control module for controlling the vehicle to travel, wherein generating the vehicle path information comprises: generating merging track information based on the lane change track information and the lane merging information; intercepting the lane path information sequence based on the lane merging information to obtain an intercepted lane path information sequence; and determining the lane change track information, the merging track information and the intercepted lane path information sequence as the vehicle path information.
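As a non-authoritative illustration of the path-assembly step described above, the following sketch shows one way the three segments could be combined. The function name, the representation of trajectories as lists of (x, y) points, and the use of an index to locate the merge point are all assumptions for illustration, not details stated in the disclosure.

```python
def generate_vehicle_path(lane_change_traj, merge_index, lane_path_seq):
    """Assemble vehicle path information from three segments: the lane
    change trajectory, a merging trajectory bridging onto the target
    lane, and the lane path sequence intercepted at the merge point."""
    # Merging track information: here modeled as a straight bridge from
    # the end of the lane change trajectory to the merge point (an
    # assumption; the disclosure only states it is generated from the
    # lane change track information and the lane merging information).
    merging_traj = [lane_change_traj[-1], lane_path_seq[merge_index]]
    # Intercept the lane path information sequence at the merge point.
    intercepted_seq = lane_path_seq[merge_index:]
    # The vehicle path information is the three segments together.
    return lane_change_traj, merging_traj, intercepted_seq
```

Concatenated in order, the three returned segments trace a continuous path from the ego vehicle's current lane onto the target lane.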
Computer program code for carrying out operations of embodiments of the present disclosure may be written in one or more programming languages or a combination thereof, including object-oriented programming languages such as Java, Smalltalk or C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case involving a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in some embodiments of the present disclosure may be implemented by software or by hardware. The described units may also be provided in a processor, which may, for example, be described as: a processor comprising an acquisition unit, a fusion processing unit, a first generation unit, a second generation unit, a detection processing unit, a third generation unit, and a generating and transmitting unit. The names of these units do not, in some cases, limit the units themselves; for example, the acquisition unit may also be described as a "unit that acquires the current vehicle perception information, the lane line perception information, the first obstacle feature information set, the second obstacle feature information set, and the lane path information sequence".
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: Field-Programmable Gate Arrays (FPGAs), Application-Specific Integrated Circuits (ASICs), Application-Specific Standard Products (ASSPs), Systems on a Chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
The foregoing description is merely a description of preferred embodiments of the present disclosure and of the technical principles employed. It will be appreciated by those skilled in the art that the scope of the invention referred to in the embodiments of the present disclosure is not limited to technical solutions formed by the specific combination of the above technical features, and also covers other technical solutions formed by any combination of the above technical features or their equivalents without departing from the above inventive concept, for example, technical solutions formed by replacing the above features with (but not limited to) technical features having similar functions disclosed in the embodiments of the present disclosure.

Claims (6)

1. A vehicle path information generation method, comprising:
acquiring current vehicle perception information, lane line perception information, a first obstacle feature information set, a second obstacle feature information set and a lane path information sequence;
based on the lane line perception information, carrying out fusion processing on the first obstacle characteristic information set and the second obstacle characteristic information set to obtain a current lane obstacle characteristic information group and a left lane obstacle characteristic information group;
generating lane change expectation information based on the current vehicle perception information and the current lane obstacle feature information group;
in response to determining that the lane change expectation information meets a preset expectation lane change condition, generating lane merging information based on the current vehicle perception information and the lane path information sequence;
in response to determining that the lane merging information meets a preset distance condition, performing detection processing on the left lane obstacle feature information group to obtain lane change decision information;
generating lane change track information based on a preset high-precision map, the current vehicle perception information, the left lane obstacle feature information group and the lane line perception information in response to determining that the lane change decision information meets a preset lane change condition;
generating vehicle path information based on the track changing track information, the lane merging information and the lane path information sequence, and sending the vehicle path information to a vehicle control module for controlling the vehicle to travel, wherein generating the vehicle path information comprises:
generating track merging track information based on the track changing track information and the lane merging information;
intercepting the lane path information sequence based on the lane merging information to obtain an intercepted lane path information sequence;
determining the track changing track information, the track merging track information and the intercepted lane path information sequence as vehicle path information;
wherein the performing fusion processing on the first obstacle feature information set and the second obstacle feature information set based on the lane line perception information to obtain a current lane obstacle feature information group and a left lane obstacle feature information group comprises:
performing redundancy removal processing on each first obstacle feature information in the first obstacle feature information set to obtain a redundancy-removed first obstacle feature information set;
performing redundancy removal processing on each second obstacle characteristic information in the second obstacle characteristic information set to obtain a redundancy-removed second obstacle characteristic information set;
for each redundancy-removed first obstacle feature information in the redundancy-removed first obstacle feature information set, determining redundancy-removed second obstacle feature information matched with the redundancy-removed first obstacle feature information in the redundancy-removed second obstacle feature information set as obstacle feature information to be deleted;
deleting the redundancy-removed second obstacle feature information which is matched with each obtained obstacle feature information to be deleted in the redundancy-removed second obstacle feature information set to obtain a deleted second obstacle feature information set;
generating a target obstacle feature information set based on the redundancy-removed first obstacle feature information set and the deleted second obstacle feature information set;
and classifying the target obstacle feature information set based on the lane line perception information to obtain a current lane obstacle feature information group and a left lane obstacle feature information group.
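To make the fusion steps recited in claim 1 concrete, here is a minimal sketch under stated assumptions: each obstacle record carries an `obstacle_id` and a lateral offset relative to the ego lane centerline, redundancy removal and cross-set matching are both by `obstacle_id`, left is the negative lateral direction, and lanes are 3.5 m wide. None of these choices is specified by the claim; they are illustrative only.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Obstacle:
    obstacle_id: int
    lateral_offset: float          # meters; negative = left of ego centerline
    longitudinal_distance: float   # meters ahead of the ego vehicle

def deduplicate(obstacles):
    """Redundancy removal: keep the first record seen for each obstacle id."""
    seen = {}
    for ob in obstacles:
        seen.setdefault(ob.obstacle_id, ob)
    return list(seen.values())

def fuse(first_set, second_set, lane_half_width=1.75):
    """Fuse the two obstacle feature sets and classify the result by lane."""
    first = deduplicate(first_set)
    second = deduplicate(second_set)
    # Records in the second set that match a record in the first set are
    # the "obstacle feature information to be deleted".
    matched_ids = {ob.obstacle_id for ob in first}
    second = [ob for ob in second if ob.obstacle_id not in matched_ids]
    # Target obstacle feature information set: union of what remains.
    target = first + second
    # Classification against the lane lines by lateral offset.
    current_lane = [ob for ob in target
                    if abs(ob.lateral_offset) <= lane_half_width]
    left_lane = [ob for ob in target
                 if -3 * lane_half_width <= ob.lateral_offset < -lane_half_width]
    return current_lane, left_lane
```

In practice the matching step would use spatial gating or track association rather than shared ids, but the id-based version preserves the claimed structure: deduplicate each set, delete matched duplicates from the second set, merge, then classify.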
2. The method according to claim 1, wherein the current vehicle perception information comprises a vehicle speed value, and each piece of current lane obstacle feature information in the current lane obstacle feature information group comprises a longitudinal distance value; and
generating lane change expectation information based on the current vehicle perception information and the current lane obstacle feature information group, wherein the generating of the lane change expectation information comprises the following steps:
selecting current lane obstacle feature information meeting a preset front direction condition from the current lane obstacle feature information group as front obstacle vehicle feature information to obtain a front obstacle vehicle feature information group;
selecting front obstacle vehicle characteristic information meeting a preset longitudinal distance condition from the front obstacle vehicle characteristic information group as target obstacle vehicle characteristic information;
acquiring a speed threshold corresponding to a longitudinal distance value included in the target obstacle vehicle characteristic information;
in response to determining that the vehicle speed value included in the current vehicle perception information is less than or equal to the speed threshold value, determining whether a preset expected speed value is greater than the speed threshold value;
and in response to determining that the preset expected speed value is greater than the speed threshold value, determining the preset expected speed value and a preset left lane change identifier as the lane change expectation information.
3. The method of claim 2, wherein the method further comprises:
in response to determining that the preset expected speed value is less than or equal to the speed threshold value, determining the speed threshold value and a preset no-lane-change identifier as the lane change expectation information.
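The decision logic of claims 2 and 3 can be sketched as follows. The choice of the nearest front obstacle as the target vehicle and the mapping from longitudinal distance to speed threshold are illustrative assumptions; the claims leave both unspecified.

```python
def lane_change_expectation(ego_speed, expected_speed,
                            front_distances, speed_threshold_for):
    """Sketch of claims 2-3. `front_distances` lists the longitudinal
    distances of obstacle vehicles already filtered to lie ahead of the
    ego vehicle; `speed_threshold_for` maps a longitudinal distance to
    the corresponding speed threshold (the claims do not say how)."""
    # Target obstacle vehicle: assumed here to be the nearest one ahead.
    threshold = speed_threshold_for(min(front_distances))
    if ego_speed <= threshold:
        if expected_speed > threshold:
            # Claim 2: expected speed value plus left lane change identifier.
            return expected_speed, "LEFT_LANE_CHANGE"
        # Claim 3: speed threshold plus no-lane-change identifier.
        return threshold, "NO_LANE_CHANGE"
    # Behavior when the ego speed already exceeds the threshold is not claimed.
    return None
```

Intuitively: a lane change is expected only when the front vehicle forces the ego vehicle below a distance-dependent threshold that the driver's desired speed would exceed.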
4. A vehicle path information generating device comprising:
an acquisition unit configured to acquire current vehicle perception information, lane line perception information, a first obstacle feature information set, a second obstacle feature information set, and a lane path information sequence;
a fusion processing unit configured to perform fusion processing on the first obstacle feature information set and the second obstacle feature information set based on the lane line perception information to obtain a current lane obstacle feature information group and a left lane obstacle feature information group;
a first generating unit configured to generate lane change expectation information based on the current vehicle perception information and the current lane obstacle feature information group;
a second generating unit configured to generate lane merging information based on the current vehicle perception information and the lane path information sequence in response to determining that the lane change expectation information satisfies a preset expectation lane change condition;
a detection processing unit configured to, in response to determining that the lane merging information meets a preset distance condition, perform detection processing on the left lane obstacle feature information group to obtain lane change decision information;
a third generating unit configured to generate lane change trajectory information based on a preset high-precision map, the current vehicle perception information, the left lane obstacle feature information group, and the lane line perception information in response to determining that the lane change decision information satisfies a preset lane change condition;
a generating and transmitting unit configured to generate vehicle path information based on the track changing track information, the lane merging information and the lane path information sequence, and transmit the vehicle path information to a vehicle control module for controlling the vehicle to travel, wherein generating the vehicle path information comprises:
generating track merging track information based on the track changing track information and the lane merging information;
intercepting the lane path information sequence based on the lane merging information to obtain an intercepted lane path information sequence;
determining the track changing track information, the track merging track information and the intercepted lane path information sequence as vehicle path information;
the fusion processing is performed on the first obstacle feature information set and the second obstacle feature information set based on the lane line perception information to obtain a current lane obstacle feature information group and a left lane obstacle feature information group, and the fusion processing includes:
performing redundancy removal processing on each first obstacle feature information in the first obstacle feature information set to obtain a redundancy-removed first obstacle feature information set;
performing redundancy removal processing on each second obstacle characteristic information in the second obstacle characteristic information set to obtain a redundancy-removed second obstacle characteristic information set;
for each redundancy-removed first obstacle feature information in the redundancy-removed first obstacle feature information set, determining redundancy-removed second obstacle feature information matched with the redundancy-removed first obstacle feature information in the redundancy-removed second obstacle feature information set as obstacle feature information to be deleted;
deleting the redundancy-removed second obstacle feature information which is matched with each obtained obstacle feature information to be deleted in the redundancy-removed second obstacle feature information set to obtain a deleted second obstacle feature information set;
generating a target obstacle feature information set based on the redundancy-removed first obstacle feature information set and the deleted second obstacle feature information set;
and classifying the target obstacle feature information set based on the lane line perception information to obtain a current lane obstacle feature information group and a left lane obstacle feature information group.
5. An electronic device, comprising:
one or more processors;
a storage device having one or more programs stored thereon,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-3.
6. A computer-readable medium, on which a computer program is stored, wherein the computer program, when being executed by a processor, carries out the method according to any one of claims 1-3.
CN202211568606.6A 2022-12-08 2022-12-08 Vehicle path information generation method, device, equipment and computer readable medium Active CN115657684B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211568606.6A CN115657684B (en) 2022-12-08 2022-12-08 Vehicle path information generation method, device, equipment and computer readable medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211568606.6A CN115657684B (en) 2022-12-08 2022-12-08 Vehicle path information generation method, device, equipment and computer readable medium

Publications (2)

Publication Number Publication Date
CN115657684A CN115657684A (en) 2023-01-31
CN115657684B true CN115657684B (en) 2023-03-28

Family

ID=85020141

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211568606.6A Active CN115657684B (en) 2022-12-08 2022-12-08 Vehicle path information generation method, device, equipment and computer readable medium

Country Status (1)

Country Link
CN (1) CN115657684B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116299474B (en) * 2023-05-23 2023-09-12 禾多科技(北京)有限公司 Integrated radar device and vehicle obstacle avoidance method

Citations (3)

Publication number Priority date Publication date Assignee Title
CN110333714A (en) * 2019-04-09 2019-10-15 武汉理工大学 A kind of pilotless automobile paths planning method and device
DE102019120160A1 (en) * 2019-07-25 2021-01-28 Daimler Ag Enabling the safe removal of obstacles on a road in front of an automated vehicle
CN114021840A (en) * 2021-11-12 2022-02-08 京东鲲鹏(江苏)科技有限公司 Channel switching strategy generation method and device, computer storage medium and electronic equipment

Family Cites Families (10)

Publication number Priority date Publication date Assignee Title
JP6558239B2 (en) * 2015-12-22 2019-08-14 アイシン・エィ・ダブリュ株式会社 Automatic driving support system, automatic driving support method, and computer program
CN109987092B (en) * 2017-12-28 2020-10-30 郑州宇通客车股份有限公司 Method for determining vehicle obstacle avoidance and lane change time and method for controlling obstacle avoidance and lane change
CN111661055B (en) * 2020-06-17 2023-06-06 清华大学苏州汽车研究院(吴江) Lane changing control method and system for automatic driving vehicle
CN112020014B (en) * 2020-08-24 2022-08-19 中国第一汽车股份有限公司 Lane change track planning method, device, server and storage medium
CN112835030A (en) * 2020-12-30 2021-05-25 深圳承泰科技有限公司 Data fusion method and device for obstacle target and intelligent automobile
CN113920735B (en) * 2021-10-21 2022-11-15 中国第一汽车股份有限公司 Information fusion method and device, electronic equipment and storage medium
CN114357814B (en) * 2022-03-21 2022-05-31 禾多科技(北京)有限公司 Automatic driving simulation test method, device, equipment and computer readable medium
CN115123216A (en) * 2022-08-05 2022-09-30 国汽智控(北京)科技有限公司 Vehicle obstacle avoidance method and device, vehicle, equipment and storage medium
CN115179949B (en) * 2022-09-13 2022-11-29 毫末智行科技有限公司 Vehicle speed-changing control method, device, equipment and storage medium
CN115339453B (en) * 2022-10-19 2022-12-23 禾多科技(北京)有限公司 Vehicle lane change decision information generation method, device, equipment and computer medium

Patent Citations (3)

Publication number Priority date Publication date Assignee Title
CN110333714A (en) * 2019-04-09 2019-10-15 武汉理工大学 A kind of pilotless automobile paths planning method and device
DE102019120160A1 (en) * 2019-07-25 2021-01-28 Daimler Ag Enabling the safe removal of obstacles on a road in front of an automated vehicle
CN114021840A (en) * 2021-11-12 2022-02-08 京东鲲鹏(江苏)科技有限公司 Channel switching strategy generation method and device, computer storage medium and electronic equipment

Also Published As

Publication number Publication date
CN115657684A (en) 2023-01-31


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP03 Change of name, title or address

Address after: 201, 202, 301, No. 56-4 Fenghuang South Road, Huadu District, Guangzhou City, Guangdong Province, 510806

Patentee after: Heduo Technology (Guangzhou) Co.,Ltd.

Address before: 100099 101-15, 3rd floor, building 9, yard 55, zique Road, Haidian District, Beijing

Patentee before: HOLOMATIC TECHNOLOGY (BEIJING) Co.,Ltd.