CN116353627A - Vehicle planning control method and device, electronic equipment and vehicle - Google Patents

Vehicle planning control method and device, electronic equipment and vehicle

Info

Publication number
CN116353627A
Authority
CN
China
Prior art keywords
lane line
vehicle
confidence
map
precision map
Prior art date
Legal status
Pending
Application number
CN202310179794.1A
Other languages
Chinese (zh)
Inventor
杨凯
胡江滔
曹光植
姬猛
Current Assignee
Shanghai Yunji Yuedong Intelligent Technology Development Co ltd
Original Assignee
Shanghai Yunji Yuedong Intelligent Technology Development Co ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Yunji Yuedong Intelligent Technology Development Co ltd filed Critical Shanghai Yunji Yuedong Intelligent Technology Development Co ltd
Priority to CN202310179794.1A
Publication of CN116353627A

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W 60/00 - Drive control systems specially adapted for autonomous road vehicles
    • B60W 60/001 - Planning or execution of driving tasks
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 - Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/38 - Electronic maps specially adapted for navigation; Updating thereof
    • G01C 21/3804 - Creation or updating of map data
    • G01C 21/3807 - Creation or updating of map data characterised by the type of data
    • G01C 21/3815 - Road data
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T - CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T 10/00 - Road transport of goods or passengers
    • Y02T 10/10 - Internal combustion engine [ICE] based vehicles
    • Y02T 10/40 - Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Human Computer Interaction (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention provides a vehicle planning control method and device, electronic equipment and a vehicle. The method includes: obtaining at least one perceived lane line of a target road section in front of the own vehicle and a map lane line corresponding to the target road section in a high-precision map, the perceived lane line being generated from perception data of the own vehicle; determining fusion matching states among the at least one perceived lane line and between the at least one perceived lane line and the map lane line respectively; determining the confidence of the high-precision map and the confidence of the perception data according to the fusion matching states; and determining a planning control strategy for the own vehicle based on the confidence of the high-precision map and the confidence of the perception data. The method and device realize automatic driving planning control, improve positioning accuracy, avoid positioning jumps and improve automatic driving safety.

Description

Vehicle planning control method and device, electronic equipment and vehicle
Technical Field
The present invention relates to the field of vehicle planning control, and in particular, to a vehicle planning control method and apparatus, an electronic device, and a vehicle.
Background
Currently, positioning for automatic driving generally adopts multi-sensor fusion, combining data from RTK (Real-Time Kinematic, carrier-phase differential positioning), an IMU (Inertial Measurement Unit), a vector map and a point cloud map to obtain positioning data for the autonomous vehicle. However, the driving environment of an autonomous vehicle is very complex, and every fusion scheme may leave some edge scenarios unhandled.
Moreover, in different environments the data from the multiple sensors and the high-precision map often contain errors, sometimes large ones. For example, in a tunnel scenario the positioning signal has a large error; the point cloud and map feature information is not distinctive, so accurate positioning through matching is difficult; and dead-reckoning from the IMU and the wheel speed sensor suffers from sensor zero drift, changes in vehicle parameters and other error sources. Likewise, because a high-precision map has a production cycle, the map data used by the autonomous vehicle may not be updated in a timely manner.
Driving planning in automatic driving depends on positioning data. Depending on the environmental conditions, the sensor data or high-precision map data from which the positioning data is derived often contain errors, sometimes large ones, so the positioning data cannot achieve high-precision positioning and positioning jumps easily occur, and such positioning problems readily affect the safety of the automatic driving system.
Therefore, how to realize automatic driving planning control so as to improve positioning accuracy, avoid positioning jump and improve automatic driving safety is a technical problem to be solved in the field.
Disclosure of Invention
In order to overcome the defects of the related art, the invention provides a vehicle planning control method and device, electronic equipment and a vehicle, so as to realize automatic driving planning control, improve positioning accuracy, avoid positioning jump and improve automatic driving safety.
According to an aspect of the present invention, there is provided a vehicle planning control method including:
at least one perceived lane line of a target road section in front of a vehicle and a map lane line corresponding to the target road section in a high-precision map are obtained, and the perceived lane line is generated by utilizing perceived data of the vehicle;
determining fusion matching states among the at least one perceived lane line and between the at least one perceived lane line and the map lane line respectively;
determining the confidence of the high-precision map and the confidence of the perception data according to the fusion matching states;
and determining a planning control strategy for the vehicle based on the confidence level of the high-precision map and the confidence level of the perception data.
In some embodiments of the present application, the fusion matching state includes a first degree of matching between the at least one perceived lane line, and a second degree of matching between the at least one perceived lane line and the map lane line, respectively.
In some embodiments of the present application, the determining the confidence level of the high-precision map and the perception data according to the fusion matching state includes:
under the condition that the first matching degree meets a first preset condition, determining that the confidence degree of the perception data is larger than a first preset threshold value;
and/or determining that the confidence coefficient of the high-precision map is larger than a second preset threshold value under the condition that the second matching degree meets a second preset condition.
In some embodiments of the present application, the determining that the confidence level of the high-precision map is greater than a second preset threshold in the case that the second matching level is determined to satisfy a second preset condition includes:
and determining that the confidence coefficient of the high-precision map is larger than a second preset threshold value under the condition that at least one of the second matching degrees is larger than a third preset threshold value.
In some embodiments of the present application, the at least one perceived lane line includes a visual lane line and a laser point cloud lane line, wherein the visual lane line is generated using perception data acquired by a visual sensor and the laser point cloud lane line is generated using perception data acquired by a laser radar; correspondingly, the determining that the confidence of the high-precision map is greater than a second preset threshold value if at least one of the second matching degrees is greater than a third preset threshold value includes:
Determining the confidence coefficient of the high-precision map as a first confidence coefficient under the condition that the second matching degree between the visual lane line and the map lane line is larger than a third preset threshold value;
determining the confidence coefficient of the high-precision map as a second confidence coefficient under the condition that the second matching degree between the laser point cloud lane line and the map lane line is larger than a third preset threshold value;
the first confidence coefficient and the second confidence coefficient are both larger than a second preset threshold value, and the first confidence coefficient is set to be larger than the second confidence coefficient.
In some embodiments of the present application, the determining a planning control strategy for the own vehicle based on the high-precision map and the confidence of the perceived data includes:
and determining a planning control strategy for the own vehicle based on the perception data under the condition that the confidence coefficient of the perception data is determined to be larger than a first preset threshold value and the confidence coefficient of the high-precision map is smaller than or equal to a second preset threshold value.
In some embodiments of the present application, the determining a planning control strategy for the own vehicle based on the high-precision map and the confidence of the perceived data includes:
and determining a planning control strategy for the vehicle based on the high-precision map under the condition that the confidence coefficient of the high-precision map is larger than a second preset threshold value and the confidence coefficient of the perception data is smaller than or equal to a first preset threshold value.
According to yet another aspect of the present application, there is also provided a vehicle planning control device, including:
the system comprises a lane line acquisition module, a control module and a control module, wherein the lane line acquisition module is configured to acquire at least one sensing lane line of a target road section in front of a vehicle and a map lane line corresponding to the target road section in a high-precision map, and the sensing lane line is generated by using sensing data of the vehicle;
a fusion matching state determining module configured to determine fusion matching states among the at least one perceived lane line and between the at least one perceived lane line and the map lane line respectively;
the confidence determining module is configured to determine the confidence of the high-precision map and the confidence of the perception data according to the fusion matching state;
the confidence level of the high-precision map and the confidence level of the perception data are used for determining a planning control strategy for the vehicle.
According to still another aspect of the present application, there is also provided an electronic apparatus including: a processor; a memory for storing processor-executable instructions; wherein the processor is configured to implement the above-described method when executing the instructions.
According to still another aspect of the present application, there is also provided a vehicle including:
A perception sensor configured to acquire perception data of the own vehicle;
a perception fusion module configured to:
at least one perceived lane line of a target road section in front of a vehicle and a map lane line corresponding to the target road section in a high-precision map are obtained, and the perceived lane line is generated by utilizing perceived data of the vehicle;
determining fusion matching states among the at least one perceived lane line and between the at least one perceived lane line and the map lane line respectively;
determining the confidence coefficient of the high-precision map and the confidence coefficient of the perception data according to the fusion matching state;
and the planning control module is configured to determine a planning control strategy for the vehicle based on the confidence of the high-precision map and the confidence of the perception data.
Compared with the prior art, the invention has the advantages that:
the confidence of the high-precision map and the confidence of the perceived data are determined through fusion matching states among at least one perceived lane line and between at least one perceived lane line and the map lane line respectively, so that the planning control strategy of the vehicle is determined according to the confidence of the high-precision map and the confidence of the perceived data, the high-precision map data and/or the perceived data with higher confidence can be used in the planning control strategy of the vehicle, and the problem of positioning errors of the vehicle due to data errors or data errors of the high-precision map data and the perceived data is avoided, thereby improving the safety of automatic driving planning.
Drawings
The above and other features and advantages of the present invention will become more apparent by describing in detail exemplary embodiments thereof with reference to the attached drawings.
Fig. 1 shows a flowchart of a vehicle planning control method according to an embodiment of the present invention.
Fig. 2 shows a block diagram of a vehicle planning control device according to an embodiment of the present invention.
Fig. 3 shows a block diagram of a vehicle according to an embodiment of the invention.
Fig. 4 schematically illustrates a computer-readable storage medium according to an exemplary embodiment of the present invention.
Fig. 5 schematically illustrates a schematic diagram of an electronic device in an exemplary embodiment of the invention.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Furthermore, the drawings are merely schematic illustrations of the present invention and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus a repetitive description thereof will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in software or in one or more hardware modules or integrated circuits or in different networks and/or processor devices and/or microcontroller devices.
The flow diagrams depicted in the figures are exemplary only and not necessarily all steps are included. For example, some steps may be decomposed, and some steps may be combined or partially combined, so that the order of actual execution may be changed according to actual situations.
Fig. 1 shows a flowchart of a vehicle planning control method according to an embodiment of the present invention. The vehicle planning control method provided by the application comprises the following steps:
step S110: at least one perceived lane line acquired from a target road section in front of a vehicle and a map lane line corresponding to the target road section in a high-precision map, wherein the perceived lane line is generated by using perceived data of the vehicle.
Step S120: and determining fusion matching states among the at least one sensing lane line and between the at least one sensing lane line and the map lane line respectively.
Step S130: and determining the confidence coefficient of the high-precision map and the confidence coefficient of the perception data according to the fusion matching state.
Step S140: and determining a planning control strategy for the vehicle based on the confidence level of the high-precision map and the confidence level of the perception data.
In the vehicle planning control method provided by the application, the confidence coefficient of the high-precision map and the confidence coefficient of the perception data are determined through the fusion matching states among at least one perception lane line and between at least one perception lane line and the map lane line respectively, so that the planning control strategy of the vehicle is determined according to the confidence coefficient of the high-precision map and the confidence coefficient of the perception data, and therefore the high-precision map data and/or the perception data with higher confidence coefficient can be used in the planning control strategy of the vehicle, the problem that the positioning error of the vehicle is caused by data errors or data errors of the high-precision map data and the perception data is avoided, and the automatic driving planning safety is improved.
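As a purely illustrative aid (not taken from the patent), the following minimal Python sketch walks through steps S110 to S140 with hypothetical names; the matching rule, the confidence rules and the 0.8 thresholds are simplified stand-ins for the embodiments described below.

    # Hypothetical sketch of steps S110-S140; all names are illustrative only.
    from dataclasses import dataclass
    from typing import List, Tuple

    @dataclass
    class LaneLine:
        points: List[Tuple[float, float]]  # sampled (x, y) points in the vehicle frame
        source: str                        # "vision", "lidar" or "map"

    def matching_degree(a: LaneLine, b: LaneLine) -> float:
        """Average point-to-point distance mapped to a 0..1 matching degree."""
        dists = [((xa - xb) ** 2 + (ya - yb) ** 2) ** 0.5
                 for (xa, ya), (xb, yb) in zip(a.points, b.points)]
        return 1.0 / (1.0 + sum(dists) / len(dists))

    def plan_control_cycle(vision: LaneLine, lidar: LaneLine, map_line: LaneLine) -> str:
        # S120: fusion matching state = first matching degree (between the perceived
        # lane lines) and second matching degrees (each perceived line vs. the map line)
        s1 = matching_degree(vision, lidar)
        s2 = [matching_degree(vision, map_line), matching_degree(lidar, map_line)]
        # S130: confidences derived from the fusion matching state (simplified rule)
        conf_perception, conf_map = s1, max(s2)
        # S140: choose a planning control strategy from the two confidences
        if conf_map > 0.8 and conf_perception <= 0.8:
            return "plan from the high-precision map"
        if conf_perception > 0.8 and conf_map <= 0.8:
            return "plan from the perception data"
        return "plan from fused map and perception data"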
In particular, the perception data of the own vehicle may include visual sensor data as well as radar sensor data; other perception data are also within the scope of the present application. A perceived lane line can be detected from the perception data. For example, a vision sensor produces visual sensor data, and a visual lane line may be obtained by lane line detection on that data; such detection may be implemented by various artificial intelligence models or image processing algorithms, which is not limited in this application. Likewise, a laser radar may produce laser point cloud data, and a laser point cloud lane line may be obtained by lane line detection on the point cloud, for example by matching features extracted from the point cloud against preset lane line features, which again is not limited in this application. Of course, in other embodiments the perception data may also be obtained by other perception sensors such as a millimeter wave radar, and a perceived lane line may be detected from that data; for example, a millimeter wave radar may provide measurements such as echo intensity, range and angle, from which a perceived lane line can be detected.
Specifically, the front target section of the vehicle may be set as a section within a set area in front of the vehicle as needed. For example, the front target section may be a section 0.5 m to 5 m in front of the vehicle, which is not limited in this application. The sensing data is acquired based on the sensing sensor, and the sensing sensor is arranged on the vehicle, so that the sensing lane line of the front target road section of the vehicle can be detected according to the distance between each feature in the sensing data and the vehicle. In some embodiments, the position of the vehicle in the high-precision map may be determined according to a positioning signal of the vehicle, such as a GPS (global positioning system) positioning signal, a beidou positioning signal, and the like, so as to obtain a map lane line of a target road section in front of the vehicle in the high-precision map according to the determined position of the vehicle. In other embodiments, the own vehicle position of the own vehicle in the high-precision map may be determined based on, for example, the inertial measurement unit and the vehicle travel information, so that the map lane line of the own vehicle front target section in the high-precision map is acquired based on the determined own vehicle position. In still other embodiments, map lane lines corresponding to perceived lane lines may be determined based on feature matching of the perceived data with the high-precision map. The method for acquiring the map lane lines can be realized in a plurality of different ways, and is not repeated here.
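For illustration, selecting the map lane-line points of the target road section ahead of the positioned vehicle might be sketched as follows in Python; the 0.5 m to 5 m window and all names are assumptions taken from the example above, not part of the patent.

    # Hypothetical sketch: select map lane-line points lying in the target road
    # section ahead of the positioned ego vehicle (window values are examples only).
    import math
    from typing import Dict, List, Tuple

    def map_lane_lines_ahead(hd_map: Dict[str, List[Tuple[float, float]]],
                             ego_xy: Tuple[float, float], heading_rad: float,
                             near: float = 0.5, far: float = 5.0):
        cos_h, sin_h = math.cos(heading_rad), math.sin(heading_rad)
        ahead: Dict[str, List[Tuple[float, float]]] = {}
        for lane_id, points in hd_map.items():
            kept = []
            for x, y in points:
                dx, dy = x - ego_xy[0], y - ego_xy[1]
                longitudinal = dx * cos_h + dy * sin_h   # distance along the heading
                if near <= longitudinal <= far:
                    kept.append((x, y))
            if kept:
                ahead[lane_id] = kept
        return ahead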
Specifically, the fusion matching state may include one or more of a matching degree between lane lines, a fusion weight between lane lines, and a lane line confidence provided by lane line detection.
The matching degree between lane lines may be based on the distance between them. For example, after a perceived lane line is detected, a mathematical expression of the lane line, such as a curve equation, may be determined by curve fitting or the like. On this basis, when determining the matching degree between two lane lines, the Euclidean distance between them may be calculated from their mathematical expressions: the shorter the Euclidean distance, the higher the matching degree. Of course, in other embodiments the matching degree may also be determined by a curve matching algorithm such as the Iterative Closest Point (ICP) method, which is not limited herein. Determining the fusion matching state at least from the matching degree between lane lines has two benefits. On the one hand, only the same target object obtained from the different data sources (namely the perception data and the high-precision map) needs to be processed, rather than comparing large amounts of perception data and high-precision map data, which improves data processing efficiency. On the other hand, the data association between the different data sources (the matching degree between the perception data and the high-precision map) is taken into account when determining the confidence of the perception data and of the high-precision map and when determining the planning control strategy, which improves the accuracy of the resulting confidences and thus the safety of automatic driving planning.
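A minimal Python sketch of the distance-based matching degree described above, assuming each lane line is expressed as a y = f(x) polynomial in the vehicle frame; the cubic degree, the sampling window and the score mapping are illustrative choices, not requirements of the patent.

    # Illustrative only: fit each lane line with a cubic, sample both at common
    # longitudinal positions, and map the mean distance to a 0..1 matching degree.
    import numpy as np

    def fit_lane(points):                          # points: list of (x, y), at least 4
        xs, ys = zip(*points)
        return np.polyfit(xs, ys, deg=3)           # least-squares curve y = f(x)

    def matching_degree(points_a, points_b, x_samples=None):
        if x_samples is None:
            x_samples = np.linspace(0.5, 5.0, 10)  # target road section ahead
        ya = np.polyval(fit_lane(points_a), x_samples)
        yb = np.polyval(fit_lane(points_b), x_samples)
        mean_dist = float(np.mean(np.abs(ya - yb)))  # shorter distance, better match
        return 1.0 / (1.0 + mean_dist)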
In automatic driving, after the perceived lane line and the map lane line are obtained, the lane lines may be fused, so that automatic driving of the vehicle can be controlled based on the fused lane lines. Thus, in some embodiments, the fusion weight of the lane lines and the map lane lines may be perceived as a fusion match state when the lane lines are fused. And the fusion weight is used for indicating the importance degree of the perceived lane line and the map lane line when the perceived lane line and the map lane line are fused. Specifically, in the automatic driving control process, it is necessary to use lane line information (such as the position and shape of a lane line) to perform an automatic driving operation such as lane line control, lane change, and the like. The lane line information may be obtained based on a weighted fusion of the perceived lane line and the map lane line. For example, the corresponding lane line feature points can be determined by sensing a fitted curve of the lane line and the map lane line, and the coordinates of the corresponding feature points in the lane line information can be obtained by performing weighted summation based on the coordinates of the lane line feature points of the corresponding sensed lane line and the coordinates of the lane line feature points of the corresponding map lane line, and the required lane line information can be obtained by combining the feature points. The fusion weights can comprise weights used when the lane line characteristic points of the perceived lane lines and the lane line characteristic points of the map lane lines are fused. The fusion weights of the perceived lane lines may be calculated based on the confidence of the perceived lane lines. The confidence of the perceived lane line is used to represent the degree of confidence of the perceived lane line obtained. In some implementations, the perceived lane line may be predicted output based on an artificial intelligence model, such as a neural network model, which may output the perceived lane line and its probability, which may be used as a confidence level for the perceived lane. The confidence of the map lane is used to represent the confidence level of the obtained map lane. In some implementations, the confidence of the map lane lines may be calculated from the confidence of the positioning signals of the vehicle (e.g., the confidence of the positioning signals of the vehicle may be taken as the confidence of the map lane lines). The confidence of the positioning signal can be provided by the positioning module, and can be obtained by prediction of a trained artificial intelligence model based on the position of the vehicle and environmental information. The present application is not limited in this regard.
Therefore, the fusion matching state is determined at least based on the fusion weight among the lane lines, on one hand, only the same target object obtained by different data sources (namely the perception data and the high-precision map) is required to be subjected to data processing, a large amount of perception data and high-precision map data are not required to be compared, and the data processing efficiency is improved; on the other hand, because the step of fusing the lane lines exists in the automatic driving, the fusion weight when fusing the lane lines can be multiplexed, the fused state is not required to be recalculated, and the confidence degree determining efficiency of the perception data and the high-precision map is improved; on the other hand, the importance degree of fusion of different data sources (namely the perception data and the high-precision map) can be incorporated into the determination of the confidence degree of the perception data and the high-precision map and the determination of the planning control strategy, and the accuracy of the obtained confidence degree of the perception data and the high-precision map can be improved, so that the automatic driving planning safety is improved.
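Continuing the weighted fusion described above, a Python sketch of how corresponding feature points might be combined; the choice of using normalized confidences directly as fusion weights is an illustrative assumption, not a rule taken from the patent.

    # Hypothetical sketch: fuse corresponding feature points of a perceived lane
    # line and a map lane line, using normalized confidences as fusion weights.
    def fuse_lane_points(perceived_pts, map_pts, conf_perceived, conf_map):
        total = conf_perceived + conf_map
        w_p, w_m = conf_perceived / total, conf_map / total   # fusion weights
        fused = [(w_p * xp + w_m * xm, w_p * yp + w_m * ym)
                 for (xp, yp), (xm, ym) in zip(perceived_pts, map_pts)]
        # the weights themselves can also serve as part of the fusion matching state
        return fused, (w_p, w_m)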
In other embodiments, the detection algorithm for partially sensing the lane lines may provide the confidence level of the detected lane lines by itself, so that the confidence level provided by the detection algorithm may be directly used as the fusion matching state. The confidence of the perceived lane line is used to represent the degree of confidence of the perceived lane line obtained. In some implementations, the detection algorithm that perceives the lane lines may be an artificial intelligence model such as a neural network model. The perceived lane line may be predicted based on an artificial intelligence model, such as a neural network model, which may output the perceived lane line and its probability, which may be used as a confidence level for the perceived lane. The confidence of the map lane is used to represent the confidence level of the obtained map lane. In some implementations, the confidence of the map lane lines may be calculated from the confidence of the positioning signals of the vehicle (e.g., the confidence of the positioning signals of the vehicle may be taken as the confidence of the map lane lines). The confidence of the positioning signal can be provided by the positioning module, and can be obtained by prediction of a trained artificial intelligence model based on the position of the vehicle and environmental information.
Therefore, the fusion matching state is determined at least based on the confidence of the lane lines, on one hand, only the same target object obtained by different data sources (namely the perception data and the high-precision map) is required to be subjected to data processing, a large amount of perception data and high-precision map data are not required to be compared, and the data processing efficiency is improved; on the other hand, the confidence coefficient of the lane line can be directly output by related modules (such as a positioning module and a lane line detection module) generally, so that the fused state does not need to be recalculated, and the confidence coefficient determining efficiency of the perception data and the high-precision map is improved.
The present application may implement a plurality of different fusion matching states, which are not described herein.
In some embodiments, the fused matching state may include a first degree of matching between the at least one perceived lane line and a second degree of matching between the at least one perceived lane line and the map lane line, respectively. The first degree of matching between perceived lane lines and the second degree of matching between at least one perceived lane line and the map lane line, respectively, can be used to determine the confidence of high-precision maps and perceived data. Specifically, the degree of matching between lane lines may be calculated based on position data of the lane lines, shape data of the lane lines, and the like. For example, the smaller the error of the position data of the lane line, the higher the matching degree. In some embodiments, the error in the position data of the lane lines may be calculated from the average Euclidean distance of the corresponding points between the lane lines. The smaller the average Euclidean distance of the corresponding feature points between the lane lines, the higher the matching degree between the lane lines. For another example, the mathematical expression of the lane lines may be determined by curve fitting or the like, such as a curve equation, and the matching degree between the lane lines may be determined by a curve matching algorithm such as a nearest point search method. The present application may implement a plurality of different ways of calculating the matching degree, which will not be described herein.
Therefore, the first matching degree between at least one sensing lane line and the second matching degree between the at least one sensing lane line and the map lane line respectively can be used for taking the data association relation (the matching degree between the lane lines) between different data sources (namely, between a plurality of sensing data and between the sensing data and the high-precision map) into the determination of the confidence degree of the sensing data and the high-precision map and the determination of the planning control strategy, and the accuracy of the obtained confidence degree of the sensing data and the high-precision map can be improved, so that the automatic driving planning safety is improved.
In some embodiments, the step S130 may include: and under the condition that the first matching degree meets a first preset condition, determining that the confidence degree of the perception data is larger than a first preset threshold value. In some embodiments, the step S130 may include: and under the condition that the second matching degree meets a second preset condition, determining that the confidence degree of the high-precision map is larger than a second preset threshold value. In some embodiments, the step S130 may include: under the condition that the first matching degree meets a first preset condition, determining that the confidence degree of the perception data is larger than a first preset threshold value; and determining that the confidence coefficient of the high-precision map is larger than a second preset threshold value under the condition that the second matching degree meets a second preset condition. Specifically, the first preset condition and the second preset condition may be set as needed. For example, the first preset condition may include: the first matching degree is larger than a preset threshold value, so that when the matching degree between the sensing lane lines is higher through a first preset condition, the confidence degree of the sensing data is higher, and the confidence degree of the sensing data can be larger than the first preset threshold value. In a specific implementation, assuming that the preset threshold is 0.8 and the first preset threshold is 0.8, when the matching degree between the visual lane line and the laser point cloud lane line is 0.9 and greater than the preset threshold, the confidence degree of the perceived data may be greater than the first preset threshold by 0.8, for example, the confidence degree of the perceived data may be 0.9. For example, the second preset condition may include: at least one of the second matching degrees is larger than a third preset threshold value, so that when the matching degree of at least one sensing lane line and the map lane line is higher, the confidence degree of the high-precision map is higher through a second preset condition, and the confidence degree of the high-precision map can be larger than the second preset threshold value. In a specific implementation, assuming that the third preset threshold is 0.8 and the second preset threshold is 0.8, when the matching degree between the visual lane line and the map lane line is 0.9 and is greater than the third preset threshold, the confidence degree of the high-precision map may be greater than the second preset threshold by 0.8, for example, the confidence degree of the high-precision map may be 0.9; when the matching degree between the laser point cloud lane line and the map lane line is 0.9 and is greater than the third preset threshold, the confidence degree of the high-precision map may be greater than the second preset threshold by 0.8, for example, the confidence degree of the high-precision map may be 0.9. 
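The threshold logic above can be summarized in a short Python sketch; the values mirror the numerical example in the text (thresholds of 0.8, matching degrees of 0.9) and the function name is hypothetical.

    # Illustrative check of the first and second preset conditions described above.
    def confidences_from_matching(first_degree, second_degrees,
                                  preset=0.8,          # threshold on the first matching degree
                                  third_preset=0.8):   # threshold on the second matching degrees
        conf_perception = None
        conf_map = None
        if first_degree > preset:                      # first preset condition holds
            conf_perception = first_degree             # e.g. 0.9, above the first preset threshold 0.8
        if any(s > third_preset for s in second_degrees):   # second preset condition holds
            conf_map = max(second_degrees)             # e.g. 0.9, above the second preset threshold 0.8
        return conf_perception, conf_map

    # Example from the text: visual vs. point-cloud match 0.9, visual vs. map match 0.9
    print(confidences_from_matching(0.9, [0.9, 0.6]))  # -> (0.9, 0.9)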
Therefore, through the different embodiments, on one hand, the confidence degree of the perception data and/or the confidence degree of the high-precision map can be determined, so that the method and the device can adapt to different automatic driving planning control conditions; on the other hand, the data amount required by the calculation of the confidence coefficient of the perception data and/or the confidence coefficient of the high-precision map is small, the calculation mode is simple, the calculation efficiency is high, and the real-time performance of the planning control strategy of the self-vehicle of the automatic driving can be adapted.
Further, the confidence level of the perceived data and the confidence level of the high-precision map may be calculated according to the first matching level and the second matching level. For example, the confidence of the perceived data may be positively correlated with a first degree of matching between the perceived lane of the perceived data and perceived lane of other perceived data, and/or a second degree of matching between the perceived lane of the perceived data and the map lane. In other words, the higher the perceived lane line of the perceived data matches with other perceived lane lines and/or map lane lines, the higher the confidence of the perceived data.
In some implementations, the confidence of the perception data of the vision sensor is T_v = k_1 · S_1, where k_1 is a positive correlation parameter and S_1 is the first matching degree between the visual lane line and the laser point cloud lane line. k_1 may be set empirically, determined from historical lane line data, or predicted by an artificial intelligence model. In an embodiment in which k_1 is determined from historical lane line data, the confidence of the perception data can be calculated with different positive correlation parameters on the historical lane line data, the planning control strategy of the own vehicle determined from each resulting confidence, and the positive correlation parameter that yields the safest planning control strategy selected.
In some implementations, the confidence of the perception data of the vision sensor is T_v = k_2 · S_2, where k_2 is a positive correlation parameter and S_2 is the second matching degree between the visual lane line and the map lane line. k_2 may be determined in the same way as k_1, which is not repeated here.
In some implementations, the confidence of the perception data of the vision sensor is T_v = k_3 · (S_1 + S_2) / 2, where k_3 is a positive correlation parameter, S_1 is the first matching degree between the visual lane line and the laser point cloud lane line, and S_2 is the second matching degree between the visual lane line and the map lane line. k_3 may be determined in the same way as k_1, which is not repeated here.
In other implementations, the three calculation modes T_v = k_1 · S_1, T_v = k_2 · S_2 and T_v = k_3 · (S_1 + S_2) / 2 may each be tested against historical lane line data, and the calculation mode whose resulting planning control strategy of the own vehicle has the highest safety is used as the confidence calculation mode for the perception data.
The confidence of the sensing data of the radar sensor may be calculated in a similar manner to that of the sensing data of the vision sensor, and will not be described herein. The present application is not limited in this regard.
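The three calculation modes for the vision sensor's confidence can be written compactly in Python as below; the coefficient values of 1.0 are placeholders, since the text only requires k_1, k_2 and k_3 to be positive correlation parameters.

    # Sketch of T_v = k1*S1, T_v = k2*S2 and T_v = k3*(S1+S2)/2; k values are placeholders.
    def vision_confidence(s1, s2, mode="combined", k1=1.0, k2=1.0, k3=1.0):
        """s1: first matching degree (visual vs. laser point cloud lane line);
        s2: second matching degree (visual vs. map lane line)."""
        if mode == "first":
            return k1 * s1
        if mode == "second":
            return k2 * s2
        return k3 * (s1 + s2) / 2.0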
For another example, the confidence of the high-precision map may be positively correlated with a second degree of matching of the map lane line and the at least one perceived lane line. In other words, the higher the second degree of matching between the map lane lines and the perceived lane lines, the higher the confidence of the high-precision map.
In some implementations, the confidence of the high-precision map is T_m = k_4 · S_2, where k_4 is a positive correlation parameter and S_2 is the second matching degree between the visual lane line and the map lane line. k_4 may be set empirically, determined from historical lane line data, or predicted by an artificial intelligence model. In an embodiment in which k_4 is determined from historical lane line data, the confidence of the high-precision map can be calculated with different positive correlation parameters on the historical lane line data, the planning control strategy of the own vehicle determined from each resulting confidence, and the positive correlation parameter that yields the safest planning control strategy selected.
In some implementations, the confidence of the high-precision map is T_m = k_5 · S_4, where k_5 is a positive correlation parameter and S_4 is the second matching degree between the laser point cloud lane line and the map lane line. k_5 may be determined in the same way as k_4, which is not repeated here.
In some implementations, the confidence of the high-precision map is T_m = k_6 · (S_2 + S_4) / 2, where k_6 is a positive correlation parameter, S_2 is the second matching degree between the visual lane line and the map lane line, and S_4 is the second matching degree between the laser point cloud lane line and the map lane line. k_6 may be determined in the same way as k_4, which is not repeated here.
In other implementations, the three calculation modes T_m = k_4 · S_2, T_m = k_5 · S_4 and T_m = k_6 · (S_2 + S_4) / 2 may each be tested against historical lane line data, and the calculation mode whose resulting planning control strategy of the own vehicle has the highest safety is used as the confidence calculation mode for the high-precision map.
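Correspondingly, a Python sketch of the three calculation modes for the map confidence, with a toy illustration of picking the mode that performed best on historical data; the scoring callback and all names are placeholders.

    # Sketch of T_m = k4*S2, T_m = k5*S4 and T_m = k6*(S2+S4)/2; k values are placeholders.
    def map_confidence(s2, s4, mode="combined", k4=1.0, k5=1.0, k6=1.0):
        """s2: second matching degree (visual vs. map lane line);
        s4: second matching degree (laser point cloud vs. map lane line)."""
        if mode == "visual":
            return k4 * s2
        if mode == "lidar":
            return k5 * s4
        return k6 * (s2 + s4) / 2.0

    def pick_best_mode(historical_samples, safety_score):
        """Choose the mode whose resulting strategies scored safest on history.
        historical_samples: iterable of (s2, s4); safety_score: placeholder callback."""
        modes = ("visual", "lidar", "combined")
        return max(modes, key=lambda m: sum(safety_score(map_confidence(s2, s4, m))
                                            for s2, s4 in historical_samples))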
The present application may also implement different ways of calculating the matching degree and the confidence degree, which are not described herein.
In some embodiments, the at least one perceived lane line may include a visual lane line and a laser point cloud lane line. The visual lane line is generated by using the acquired perception data of the visual sensor, the laser point cloud lane line is generated by using the acquired perception data of the laser radar, and correspondingly, when determining that at least one of the second matching degrees is greater than a third preset threshold, determining that the confidence degree of the high-precision map is greater than the second preset threshold may include the following steps: determining the confidence coefficient of the high-precision map as a first confidence coefficient under the condition that the second matching degree between the visual lane line and the map lane line is larger than a third preset threshold value; and determining the confidence coefficient of the high-precision map as a second confidence coefficient under the condition that the second matching degree between the laser point cloud lane line and the map lane line is larger than a third preset threshold value. The first confidence coefficient and the second confidence coefficient are both larger than a second preset threshold value, and the first confidence coefficient is set to be larger than the second confidence coefficient.
In a specific implementation, assuming that the second preset threshold is 0.8 and the third preset threshold is 0.7, when the matching degree between the visual lane line and the map lane line is 0.9 and is greater than the third preset threshold, the confidence degree of the high-precision map is determined to be 1; when the matching degree between the laser point cloud lane line and the map lane line is 0.9 and is larger than a third preset threshold value, the confidence degree of the high-precision map is 0.9.
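Using the example values above (third preset threshold 0.7, second preset threshold 0.8, confidences 1 and 0.9), the two-tier assignment might be sketched as follows; everything beyond those example values is illustrative.

    # Illustrative two-tier rule from the worked example above.
    def map_confidence_tier(s_visual_map, s_lidar_map,
                            third_preset=0.7, first_conf=1.0, second_conf=0.9):
        if s_visual_map > third_preset:      # map line confirmed by the visual lane line
            return first_conf
        if s_lidar_map > third_preset:       # confirmed only by the laser point cloud line
            return second_conf
        return None                          # second preset condition not satisfied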
Specifically, in the present embodiment, the confidence of the map lane line is determined by the magnitude of the second degree of matching between the map lane line and the different perceived lane line. Specifically, since the data of the high-precision map is obtained based on the offline processing of the laser point cloud obtained by the laser sensor scanning, the difference between the sensing data obtained by the visual sensor and the high-precision map is larger than that of the sensing data of the laser point cloud, so that when the matching degree of the map lane line and the visual lane line is higher, the confidence of the map lane line can be determined to be higher.
The above embodiments merely illustrate one way of determining the confidence of the map lane line in the present application, and the present application is not limited thereto; many different ways of determining the confidence of the map lane line based on the first matching degree and the second matching degrees may be implemented. For example, when the first matching degree and the second matching degrees are all greater than the third preset threshold, that is, when the visual lane line, the laser point cloud lane line and the map lane line all match one another, the confidence of the high-precision map may be determined to be greater than or equal to a confidence T_1; when the second matching degree between the visual lane line and the map lane line is greater than the third preset threshold, while the second matching degree between the laser point cloud lane line and the map lane line and the first matching degree between the visual lane line and the laser point cloud lane line are smaller than the third preset threshold, the confidence of the high-precision map is determined to be T_2; when the second matching degree between the laser point cloud lane line and the map lane line is greater than the third preset threshold, while the second matching degree between the visual lane line and the map lane line and the first matching degree between the visual lane line and the laser point cloud lane line are smaller than the third preset threshold, the confidence of the high-precision map is determined to be T_3; and when the first matching degree and the second matching degrees are all smaller than the third preset threshold, that is, when none of the lane lines match, the confidence of the high-precision map may be determined to be less than or equal to a confidence T_4. The confidences T_1 to T_4 decrease in sequence. Similarly, the confidence of the perceived lane line may also be determined according to the value ranges of the first matching degree and the second matching degree; further variations can be implemented in the present application and are not described here.
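A Python sketch of the four-tier variant just described; the values of T_1 > T_2 > T_3 > T_4 are placeholders, and the handling of the remaining mixed cases is an assumption added only so the function is total.

    # Illustrative four-tier rule; T1..T4 and the fallback tier are assumptions.
    def tiered_map_confidence(s_first, s_visual_map, s_lidar_map,
                              third_preset=0.7, t1=1.0, t2=0.9, t3=0.8, t4=0.5):
        pair_ok = s_first > third_preset
        vis_ok = s_visual_map > third_preset
        lidar_ok = s_lidar_map > third_preset
        if pair_ok and vis_ok and lidar_ok:
            return t1                        # everything matches: highest confidence
        if vis_ok and not lidar_ok and not pair_ok:
            return t2                        # only the visual line confirms the map line
        if lidar_ok and not vis_ok and not pair_ok:
            return t3                        # only the point-cloud line confirms the map line
        if not (pair_ok or vis_ok or lidar_ok):
            return t4                        # nothing matches: lowest confidence
        return t3                            # other mixed cases: assumed intermediate tier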
In some embodiments, step S140 may include determining a planning control strategy for the own vehicle based on the perception data if it is determined that the confidence of the perception data is greater than a first preset threshold and the confidence of the high-precision map is less than or equal to a second preset threshold. Planning control strategies include, but are not limited to, vehicle positioning, prediction of actions of objects surrounding the vehicle, travel control strategies, and vehicle travel planning. In particular, the driving control strategy may include different autopilot driving functions, such as a full autopilot function, a lane centering function, and the like. Further, according to different perception data and confidence of the high-precision map, different driving control strategies can be correspondingly selected. For example, when the confidence of the high-precision map is higher, a full autopilot function may be selected; when the confidence of the perceived data is higher, a lane centering function may be selected. For another example, when the confidence level of the high-precision map is the first confidence level, a full-automatic driving function may be selected; when the confidence of the high-precision map is the second confidence, a lane line centering travel function may be selected. Many more variations are possible in this application and will not be described in detail herein. Thus, in the present embodiment, it is possible to perform vehicle positioning and behavior prediction of an object around the vehicle based on the awareness data, and to perform vehicle travel planning based on the vehicle positioning of the awareness data, the behavior prediction result, and the travel control strategy. Further, through the association relation between the confidence coefficient of the perception data and/or the confidence coefficient of the high-precision map and the driving control strategy, the determination efficiency of the driving control strategy can be improved, and the real-time performance of the planning control strategy of the self-vehicle of the automatic driving can be adapted. In addition, the vehicle positioning and the action prediction of the objects around the vehicle are performed only based on the perception data, and the vehicle running planning is performed based on the vehicle positioning, the action prediction result and the running control strategy of the perception data, so that the data quantity required to be processed can be reduced, and the real-time performance and the safety of the vehicle running planning can be improved.
In some embodiments, step S140 may include determining a planning control strategy for the own vehicle based on the high-precision map if it is determined that the confidence of the high-precision map is greater than the second preset threshold and the confidence of the perception data is less than or equal to the first preset threshold. Thus, in this embodiment, vehicle positioning and behavior prediction of objects around the vehicle can be performed based on the map data of the high-precision map, and vehicle travel planning can be performed based on that vehicle positioning, the behavior prediction result and the travel control strategy. Further, through the association between the confidence of the perception data and/or the confidence of the high-precision map and the driving control strategy, the efficiency of determining the driving control strategy can be improved, matching the real-time requirements of the own-vehicle planning control strategy in automatic driving. In addition, since vehicle positioning and behavior prediction of surrounding objects are performed based only on the high-precision map, and vehicle travel planning is performed based on that positioning, the prediction result and the travel control strategy, the amount of data to be processed is reduced, improving the real-time performance and safety of vehicle travel planning.
In other embodiments, step S140 may include determining a planning control strategy for the vehicle in combination with the awareness data and the high-precision map based on the confidence level of the awareness data and the confidence level of the high-precision map. Thus, in this embodiment, the vehicle positioning and the behavior prediction of the object around the vehicle may be performed based on the confidence level of the sensing data and the confidence level of the high-precision map, in combination with the sensing data and the high-precision map, and the vehicle travel planning may be performed based on the vehicle positioning of the sensing data, the behavior prediction result, and the travel control strategy. Further, through the association relation between the confidence coefficient of the perception data and/or the confidence coefficient of the high-precision map and the driving control strategy, the determination efficiency of the driving control strategy can be improved, and the real-time performance of the planning control strategy of the self-vehicle of the automatic driving can be adapted. In addition, based on the confidence coefficient of the perception data and the confidence coefficient of the high-precision map, the perception data and the high-precision map are combined, the vehicle is positioned, the actions of objects around the vehicle are predicted, the vehicle is positioned, the action prediction result and the running control strategy are performed, the vehicle running planning is performed, and the accuracy of positioning and behavior prediction can be improved.
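The three cases above can be collapsed into a small Python dispatcher; the association of confidence levels with driving functions follows the examples in the text, while the threshold values and the handling of the combined case are illustrative assumptions.

    # Illustrative dispatcher over the three strategy cases discussed above.
    def choose_planning_strategy(conf_perception, conf_map,
                                 first_preset=0.8, second_preset=0.8):
        if conf_map > second_preset and conf_perception <= first_preset:
            # position, predict and plan from the high-precision map only
            return {"source": "high-precision map", "driving_function": "full autopilot"}
        if conf_perception > first_preset and conf_map <= second_preset:
            # position, predict and plan from the perception data only
            return {"source": "perception data", "driving_function": "lane centering"}
        # otherwise combine both sources, weighted by their confidences
        # (the driving function chosen for this case is an assumption)
        return {"source": "fused map and perception", "driving_function": "full autopilot"}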
The above is merely a way of schematically illustrating the determination of the planning control strategy of the present application, which is not limited in this regard, and the present application may implement more different planning control strategies of the own vehicle.
The above are merely a plurality of specific implementations of the vehicle planning control method of the present invention, and each implementation may be implemented independently or in combination, and the present invention is not limited thereto. Further, the flow chart of the present invention is merely illustrative, and the execution order of steps is not limited thereto, and the splitting, merging, sequential exchange, and other synchronous or asynchronous execution of steps are all within the scope of the present invention.
Referring now to fig. 2, fig. 2 shows a block diagram of a vehicle planning control device according to an embodiment of the present invention. The vehicle planning control device 200 includes a lane line acquisition module 210, a fusion matching state determination module 220, and a confidence determination module 230.
The lane line acquisition module 210 is configured to acquire at least one perceived lane line of a target road segment in front of an own vehicle and a map lane line corresponding to the target road segment in a high-precision map, the perceived lane line being generated using perceived data of the own vehicle;
The fusion matching state determination module 220 is configured to determine fusion matching states between the at least one perceived lane line, and between the at least one perceived lane line and the map lane line, respectively;
the confidence determining module 230 is configured to determine a confidence of the high-precision map and a confidence of the perceived data according to the fusion matching state;
the confidence level of the high-precision map and the confidence level of the perception data are used for determining a planning control strategy for the vehicle.
In the vehicle planning control device according to the exemplary embodiment of the present invention, the confidence level of the high-precision map and the confidence level of the perceived data are determined by the fusion matching states between at least one perceived lane and the map lane, respectively, so that the planning control strategy for the own vehicle is determined according to the confidence level of the high-precision map and the confidence level of the perceived data, thereby ensuring that the high-precision map data and/or the perceived data with higher confidence level can be used in the planning control strategy for the own vehicle, and avoiding the problem of positioning errors of the own vehicle due to data errors or data errors of the high-precision map data and the perceived data, thereby improving the safety of automatic driving planning.
Fig. 2 is a schematic illustration of a vehicle planning control device 200 provided by the present invention, and the splitting, combining and adding of the modules are all within the protection scope of the present invention without departing from the concept of the present invention. The vehicle planning control device 200 provided by the present invention may be implemented by software, hardware, firmware, plug-in and any combination thereof, which is not limited to this.
Referring now to fig. 3, fig. 3 shows a block diagram of a vehicle according to an embodiment of the present invention. The vehicle 300 includes a perception sensor 310, a perception fusion module 320, and a planning control module.
The perception sensor 310 is configured to acquire perception data of the vehicle.
The perceptual fusion module 320 is configured to: at least one perceived lane line of a target road section in front of a vehicle and a map lane line corresponding to the target road section in a high-precision map are obtained, and the perceived lane line is generated by utilizing perceived data of the vehicle; determining fusion matching states among the at least one sensing lane line and between the at least one sensing lane line and the map lane line respectively; and determining the confidence coefficient of the high-precision map and the confidence coefficient of the perception data according to the fusion matching state.
The planning control module may include, for example, a prediction module 350 and a planning module 360. The planning control module is configured to determine a planning control strategy for the own vehicle based on the confidence level of the high-precision map and the confidence level of the perception data.
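A hedged sketch of how the planning control module might choose its data source from the two confidences is shown below. The first two branches mirror the behaviour recited in claims 6 and 7 later in this document; the remaining two branches, the string labels, and the threshold values are illustrative assumptions only.

```python
def select_planning_source(map_confidence: float, perception_confidence: float,
                           map_threshold: float = 0.5,
                           perception_threshold: float = 0.5) -> str:
    """Pick the data source for the planning control strategy from the two confidences."""
    if perception_confidence > perception_threshold and map_confidence <= map_threshold:
        return "perception"   # trust the perceived lane lines only
    if map_confidence > map_threshold and perception_confidence <= perception_threshold:
        return "map"          # trust the high-precision map only
    if map_confidence > map_threshold and perception_confidence > perception_threshold:
        return "fused"        # both trusted: use the fused result (assumed branch)
    return "degrade"          # neither trusted: degrade or hand over control (assumed branch)
```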
In some embodiments, the perception sensor 310 may include a camera 311, a lidar 312, and a millimeter wave radar 313. The perception sensor 310 may also include other perception sensors, which will not be described here.
Specifically, the perception fusion module 320 may adopt the module structure of the vehicle planning control device shown in fig. 2. The perception fusion module 320 may also fuse the perception data from the perception sensor 310 with the high-precision map data provided by the high-precision map module 330. The high-precision map module 330 may interact with the positioning module 340 to determine the position of the own vehicle in the high-precision map. The positioning module 340 may also transmit positioning information to the perception fusion module 320, so that the perception fusion module 320 can determine the correspondence between the perception data and the map data of the high-precision map.
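One plausible way to establish this correspondence is to project the perceived lane-line points into the map frame using the ego pose supplied by the positioning module. The sketch below assumes a planar pose (x, y, yaw); the function name and this 2-D simplification are assumptions of the example, not taken from the patent.

```python
import math
from typing import List, Tuple

Point = Tuple[float, float]

def vehicle_to_map_frame(points_vehicle: List[Point],
                         ego_x: float, ego_y: float, ego_yaw: float) -> List[Point]:
    """Rigid 2-D transform of perceived lane-line points from the vehicle frame
    into the map frame, given the ego pose (ego_x, ego_y in map coordinates,
    ego_yaw in radians) from the positioning module."""
    cos_y, sin_y = math.cos(ego_yaw), math.sin(ego_yaw)
    return [(ego_x + cos_y * x - sin_y * y,
             ego_y + sin_y * x + cos_y * y)
            for x, y in points_vehicle]
```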
Specifically, the prediction module 350 may predict the possible trajectories of objects near the own vehicle based on the confidence of the high-precision map and the confidence of the perception data provided by the perception fusion module 320, in combination with the high-precision map and/or the fusion result of the perception fusion module 320. Based on the same two confidences, the planning module 360 may then perform trajectory planning for the autonomous vehicle using the predicted trajectories, the high-precision map, the vehicle positioning, and other information.
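Purely as an example of the kind of computation the prediction module 350 could perform, the sketch below advances a nearby object at constant speed along a trusted lane centerline. The function name, the sampling scheme, and the constant-speed assumption are illustrative and are not part of the disclosed method.

```python
import math
from typing import List, Tuple

Point = Tuple[float, float]

def predict_along_lane(obj_pos: Point, obj_speed: float,
                       lane_centerline: List[Point],
                       horizon_s: float = 3.0, dt: float = 0.5) -> List[Point]:
    """Constant-speed prediction: snap the object to the nearest centerline vertex,
    then advance it along the polyline, emitting one point per time step."""
    start = min(range(len(lane_centerline)),
                key=lambda i: math.dist(obj_pos, lane_centerline[i]))
    traj: List[Point] = []
    idx, pos = start, lane_centerline[start]
    step_dist = obj_speed * dt
    for _ in range(int(horizon_s / dt)):
        remaining = step_dist
        while idx + 1 < len(lane_centerline) and remaining > 0.0:
            seg = math.dist(pos, lane_centerline[idx + 1])
            if seg <= remaining:
                remaining -= seg
                idx += 1
                pos = lane_centerline[idx]
            else:
                # interpolate inside the current segment
                x0, y0 = pos
                x1, y1 = lane_centerline[idx + 1]
                ratio = remaining / seg
                pos = (x0 + ratio * (x1 - x0), y0 + ratio * (y1 - y0))
                remaining = 0.0
        traj.append(pos)
    return traj
```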
In the vehicle 300 according to the exemplary embodiment of the present invention, the confidence of the high-precision map and the confidence of the perception data are determined from the fusion matching states among the at least one perceived lane line and between the at least one perceived lane line and the map lane line, respectively, so that the planning control strategy for the own vehicle is determined according to these two confidences. This ensures that only high-precision map data and/or perception data with sufficiently high confidence are used in the planning control strategy for the own vehicle, avoids positioning errors of the own vehicle caused by errors in the high-precision map data or in the perception data, and thereby improves the safety of automatic driving planning.
In an exemplary embodiment of the present invention, there is also provided a computer-readable storage medium having stored thereon a computer program which, when executed by, for example, a processor, can implement the steps of the vehicle planning control method described in any one of the above embodiments. In some possible embodiments, the various aspects of the invention may also be implemented in the form of a program product comprising program code for causing a terminal device to carry out the steps according to the various exemplary embodiments of the invention as described in the above-mentioned vehicle planning control method section of this specification, when said program product is run on the terminal device.
Referring to fig. 4, a program product 700 for implementing the above-described method according to an embodiment of the present invention is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium may be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, in which readable program code is carried. Such a propagated data signal may take any of a variety of forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the foregoing. A readable signal medium may also be any readable medium other than a readable storage medium that can send, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on a remote computing device or server. In cases involving a remote computing device, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (for example, through the Internet using an Internet service provider).
In an exemplary embodiment of the invention, an electronic device is also provided, which may include a processor, and a memory for storing executable instructions of the processor. Wherein the processor is configured to perform the steps of the vehicle planning control method of any of the above embodiments via execution of the executable instructions.
Those skilled in the art will appreciate that various aspects of the invention may be implemented as a system, method, or program product. Accordingly, aspects of the invention may be embodied in the following forms: an entirely hardware embodiment, an entirely software embodiment (including firmware, micro-code, etc.), or an embodiment combining hardware and software aspects, which may generally be referred to herein as a "circuit," a "module," or a "system."
An electronic device 500 according to such an embodiment of the invention is described below with reference to fig. 5. The electronic device 500 shown in fig. 5 is merely an example, and should not be construed as limiting the functionality and scope of use of embodiments of the present invention.
As shown in fig. 5, the electronic device 500 is embodied in the form of a general purpose computing device. The components of electronic device 500 may include, but are not limited to: at least one processing unit 510, at least one memory unit 520, a bus 530 connecting the different system components (including the memory unit 520 and the processing unit 510), a display unit 540, etc.
The storage unit stores program code that can be executed by the processing unit 510, such that the processing unit 510 performs the steps according to the various exemplary embodiments of the present invention described in the above vehicle planning control method section of this specification. For example, the processing unit 510 may execute the steps of the vehicle planning control method according to any of the above embodiments.
The storage unit 520 may include readable media in the form of volatile memory, such as a random access memory (RAM) 5201 and/or a cache memory 5202, and may further include a read-only memory (ROM) 5203.
The storage unit 520 may also include a program/utility 5204 having a set (at least one) of program modules 5205, such program modules 5205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment.
The bus 530 may represent one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures.
The electronic device 500 may also communicate with one or more external devices 600 (e.g., a keyboard, a pointing device, a Bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 500, and/or with any device (e.g., a router, a modem, etc.) that enables the electronic device 500 to communicate with one or more other computing devices. Such communication may take place through an input/output (I/O) interface 550. Moreover, the electronic device 500 may communicate with one or more networks, such as a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet, through a network adapter 560. The network adapter 560 may communicate with the other modules of the electronic device 500 via the bus 530. It should be appreciated that, although not shown, other hardware and/or software modules may be used in conjunction with the electronic device 500, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
The electronic device 500 may be a vehicle having an automatic driving function, or another component having an automatic driving function. The electronic device 500 includes, but is not limited to: a vehicle-mounted terminal, a vehicle-mounted controller, a vehicle-mounted module, a vehicle-mounted component, a vehicle-mounted chip, a vehicle-mounted unit, or a sensor such as a vehicle-mounted radar or a vehicle-mounted camera. The vehicle may implement the method provided by the present application through the vehicle-mounted terminal, the vehicle-mounted controller, the vehicle-mounted module, the vehicle-mounted component, the vehicle-mounted chip, the vehicle-mounted unit, the vehicle-mounted radar, or the vehicle-mounted camera.
The electronic device 500 may also be, or be provided in, an intelligent terminal other than a vehicle that has an automatic driving function. The intelligent terminal may be other terminal equipment such as intelligent transportation equipment, smart home equipment, or a robot. In this case, the electronic device 500 includes, but is not limited to, the intelligent terminal itself, or a controller, a chip, a sensor such as a radar or a camera, or another component within the intelligent terminal.
The electronic device 500 may be a general-purpose device or a special-purpose device. In a specific implementation, the electronic device 500 may also be a desktop computer, a laptop computer, a web server, a palmtop computer (personal digital assistant, PDA), a mobile phone, a tablet computer, a wireless terminal device, an embedded device, or another device with processing capability. The embodiments of the present application do not limit the type of the electronic device 500.
The electronic device 500 may also be a chip or a processor with processing capability, and the electronic device 500 may include multiple processors. A processor may be a single-core (single-CPU) processor or a multi-core (multi-CPU) processor. The chip or processor with processing capability may be arranged inside a sensor, or may be arranged outside the sensor at the receiving end of the sensor's output signal.
From the above description of the embodiments, those skilled in the art will readily appreciate that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Accordingly, the technical solution according to the embodiments of the present invention may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (e.g., a CD-ROM, a USB flash drive, a removable hard disk, etc.) or on a network, and which includes several instructions to cause a computing device (e.g., a personal computer, a server, or a network device) to execute the vehicle planning control method according to the embodiments of the present invention.
Compared with the prior art, the invention has the advantages that:
the confidence of the high-precision map and the confidence of the perception data are determined from the fusion matching states among the at least one perceived lane line and between the at least one perceived lane line and the map lane line, respectively, so that the planning control strategy for the own vehicle is determined according to these two confidences. This ensures that only high-precision map data and/or perception data with sufficiently high confidence are used in the planning control strategy for the own vehicle, avoids positioning errors of the own vehicle caused by errors in the high-precision map data or in the perception data, and thereby improves the safety of automatic driving planning.
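For concreteness, the sketch below turns the confidence rules recited in claims 3 to 5 into code. The threshold values, the fallback confidences, the "all pairs agree" condition for the perception data, and the exact comparison logic are illustrative assumptions where the claims leave them open.

```python
from typing import List

def high_precision_map_confidence(visual_match_degree: float,
                                  lidar_match_degree: float,
                                  third_threshold: float = 0.8,
                                  first_confidence: float = 0.9,
                                  second_confidence: float = 0.7,
                                  low_confidence: float = 0.3) -> float:
    """Assign the map confidence from the second matching degrees: a visual
    lane line match yields a higher confidence than a laser point cloud lane
    line match, and both exceed the second preset threshold (here assumed 0.5)."""
    if visual_match_degree > third_threshold:
        return first_confidence       # first confidence > second confidence
    if lidar_match_degree > third_threshold:
        return second_confidence
    return low_confidence             # no match above the third preset threshold

def perception_confidence(first_match_degrees: List[float],
                          first_threshold: float = 0.8,
                          high: float = 0.9, low: float = 0.3) -> float:
    """Assign the perception-data confidence from the first matching degrees,
    i.e. the pairwise agreement among the perceived lane lines."""
    if first_match_degrees and min(first_match_degrees) > first_threshold:
        return high
    return low
```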
Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the invention and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the following claims.

Claims (10)

1. A vehicle planning control method, characterized by comprising:
acquiring at least one perceived lane line of a target road section in front of an own vehicle and a map lane line corresponding to the target road section in a high-precision map, the perceived lane line being generated using perception data of the own vehicle;
determining fusion matching states among the at least one perceived lane line, and between the at least one perceived lane line and the map lane line, respectively;
determining a confidence of the high-precision map and a confidence of the perception data according to the fusion matching states; and
determining a planning control strategy for the own vehicle based on the confidence of the high-precision map and the confidence of the perception data.
2. The vehicle planning control method according to claim 1, wherein the fusion matching states include a first matching degree among the at least one perceived lane line, and second matching degrees between the at least one perceived lane line and the map lane line, respectively.
3. The vehicle planning control method according to claim 2, characterized in that the determining the confidence of the high-precision map and the confidence of the perception data according to the fusion matching states comprises:
determining that the confidence of the perception data is greater than a first preset threshold in the case that the first matching degree satisfies a first preset condition;
and/or determining that the confidence of the high-precision map is greater than a second preset threshold in the case that the second matching degree satisfies a second preset condition.
4. The vehicle planning control method according to claim 3, wherein the determining that the confidence of the high-precision map is greater than a second preset threshold in the case that the second matching degree satisfies a second preset condition comprises:
determining that the confidence of the high-precision map is greater than the second preset threshold in the case that at least one of the second matching degrees is greater than a third preset threshold.
5. The vehicle planning control method according to claim 4, wherein the at least one perceived lane line includes a visual lane line and a laser point cloud lane line, the visual lane line being generated using perception data acquired by a visual sensor and the laser point cloud lane line being generated using perception data acquired by a laser radar; correspondingly, the determining that the confidence of the high-precision map is greater than the second preset threshold in the case that at least one of the second matching degrees is greater than a third preset threshold comprises:
determining the confidence of the high-precision map as a first confidence in the case that the second matching degree between the visual lane line and the map lane line is greater than the third preset threshold;
determining the confidence of the high-precision map as a second confidence in the case that the second matching degree between the laser point cloud lane line and the map lane line is greater than the third preset threshold;
wherein the first confidence and the second confidence are both greater than the second preset threshold, and the first confidence is set to be greater than the second confidence.
6. The vehicle planning control method according to claim 1, wherein the determining a planning control strategy for the own vehicle based on the confidence of the high-precision map and the confidence of the perception data comprises:
determining a planning control strategy for the own vehicle based on the perception data in the case that the confidence of the perception data is greater than a first preset threshold and the confidence of the high-precision map is less than or equal to a second preset threshold.
7. The vehicle planning control method according to claim 1, wherein the determining a planning control strategy for the own vehicle based on the confidence of the high-precision map and the confidence of the perception data comprises:
determining a planning control strategy for the own vehicle based on the high-precision map in the case that the confidence of the high-precision map is greater than a second preset threshold and the confidence of the perception data is less than or equal to a first preset threshold.
8. A vehicle planning control device, characterized by comprising:
the system comprises a lane line acquisition module, a control module and a control module, wherein the lane line acquisition module is configured to acquire at least one sensing lane line of a target road section in front of a vehicle and a map lane line corresponding to the target road section in a high-precision map, and the sensing lane line is generated by using sensing data of the vehicle;
a fusion matching state determining module configured to determine fusion matching states between the at least one perceived lane line and the map lane line, respectively;
The confidence determining module is configured to determine the confidence of the high-precision map and the confidence of the perception data according to the fusion matching state;
the confidence level of the high-precision map and the confidence level of the perception data are used for determining a planning control strategy for the vehicle.
9. An electronic device, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to implement the method of any of claims 1-7 when executing the instructions.
10. A vehicle, characterized by comprising:
a perception sensor configured to acquire perception data of the own vehicle;
a perception fusion module configured to:
acquire at least one perceived lane line of a target road section in front of the own vehicle and a map lane line corresponding to the target road section in a high-precision map, the perceived lane line being generated using the perception data of the own vehicle;
determine fusion matching states among the at least one perceived lane line, and between the at least one perceived lane line and the map lane line, respectively;
determine a confidence of the high-precision map and a confidence of the perception data according to the fusion matching states; and
a planning control module configured to determine a planning control strategy for the own vehicle based on the confidence of the high-precision map and the confidence of the perception data.
CN202310179794.1A 2023-02-27 2023-02-27 Vehicle planning control method and device, electronic equipment and vehicle Pending CN116353627A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310179794.1A CN116353627A (en) 2023-02-27 2023-02-27 Vehicle planning control method and device, electronic equipment and vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310179794.1A CN116353627A (en) 2023-02-27 2023-02-27 Vehicle planning control method and device, electronic equipment and vehicle

Publications (1)

Publication Number Publication Date
CN116353627A true CN116353627A (en) 2023-06-30

Family

ID=86912685

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310179794.1A Pending CN116353627A (en) 2023-02-27 2023-02-27 Vehicle planning control method and device, electronic equipment and vehicle

Country Status (1)

Country Link
CN (1) CN116353627A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination