CN117029857A - Vehicle perception fusion method, device and storage medium based on a high-precision map

Info

Publication number: CN117029857A
Application number: CN202311033962.2A
Authority: CN (China)
Original language: Chinese (zh)
Prior art keywords: lane line, lane, line equation, map, equation
Legal status: Pending
Inventors: 王鹏程, 田磊, 刘阳, 魏维, 孙心洁
Applicant and current assignee: China National Heavy Duty Truck Group Jinan Power Co Ltd

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 - Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/26 - Navigation; Navigational instruments specially adapted for navigation in a road network
    • G01C 21/34 - Route searching; Route guidance
    • G01C 21/3453 - Special cost functions, i.e. other than distance or default speed limit of road segments
    • G01C 21/3492 - Special cost functions employing speed data or traffic data, e.g. real-time or historical


Abstract

The application provides a vehicle perception fusion method, device and storage medium based on a high-precision map, relating to the technical field of vehicle driving. In the method, a corresponding perceived lane line equation is fitted from the acquired road perception information, and a corresponding map lane line equation is fitted, in the converted running coordinate system, from the map data corresponding to the vehicle. A fused lane line equation is obtained from the normally output perceived lane line equation and map lane line equation, so that data from different sources are fused and the relationship between the perception information of the intelligent driving system and the high-precision map data is enriched. The lanes to which other vehicles belong are determined from the fused lane line equation and the position information of the other vehicles, a planned driving trajectory is obtained from the fused lane lines and the other vehicles once their lanes are determined, and vehicle control is realized according to the planned driving trajectory and the driving information of the vehicle, thereby improving the perception robustness and stability of the vehicle's intelligent driving system.

Description

Vehicle perception fusion method, device and storage medium based on high-precision map
Technical Field
The application relates to the technical field of vehicle driving, and in particular to a vehicle perception fusion method, device and storage medium based on a high-precision map.
Background
A high-precision map plays an important role in a vehicle's intelligent driving system, improving the performance of modules such as perception, positioning and planning. For the perception module, it helps extend the perception range, enrich perception information, expand the perception boundary and enable earlier driving decisions; for the positioning module, the relative position can be calculated from road signs and similar landmarks to realize loosely coupled lateral and longitudinal positioning; for the planning module, lane-level path planning and fine trajectory planning can be realized.
In the prior art, a high-precision map is usually used only to assist positioning, and perception fusion schemes based on high-precision maps are rare. One existing scheme fuses the lane lines of a smart camera with the high-precision map, but it cannot be used in a high-level intelligent-driving domain controller system. Another acquires lane line attribute information from smart-camera detection of lane lines and road edges together with the high-precision map, and completes the lane line search through positioning continuity detection, lane search, attribute matching detection and distance matching detection. That scheme, however, only realizes lane line matching: it does not include a function for fusing the perceived lane lines with the high-precision map, does not fully exploit the relationship between the perception information of the intelligent driving system and the high-precision map data, and does not fuse data from different sources. The perception information is therefore one-dimensional, and the perception robustness and stability of the vehicle's intelligent driving system are low.
The prior art is therefore deficient in that the relationship between the perception information of the intelligent driving system and the high-precision map data is insufficiently exploited, leaving the perception robustness and stability of the vehicle's intelligent driving system low.
Disclosure of Invention
The application provides a vehicle perception fusion method, device and storage medium based on a high-precision map, to solve the prior-art problems that the relationship between the perception information of the intelligent driving system and the high-precision map data is insufficiently exploited and that the perception robustness and stability of the vehicle's intelligent driving system are low.
In a first aspect, the present application provides a vehicle perception fusion method based on a high-precision map, including:
acquiring road perception information through a road perception unit of the vehicle, and fitting the road perception information to generate a corresponding perceived lane line equation and perceived lane line confidence, wherein the perceived lane line confidence indicates the reliability of the perceived lane line;
acquiring map data according to positioning data of the vehicle, establishing a road-section lane model according to the map data, converting the longitude and latitude coordinates of the lane line points of the road-section lane model into the running coordinate system of the vehicle, and fitting in the running coordinate system to generate a corresponding map lane line equation;
acquiring a fused lane line equation according to the normally output perceived lane line equation and map lane line equation, and determining the lanes to which other vehicles belong according to the fused lane line equation and the position information of the other vehicles acquired by the road perception unit;
and acquiring a planned driving trajectory according to the fused lane lines indicated by the fused lane line equation and the other vehicles whose lanes have been determined, and controlling the vehicle according to the planned driving trajectory and the driving information of the vehicle.
In one possible design, before the acquiring a fused lane line equation according to the perceived lane line equation and the map lane line equation, the method further includes:
checking each perceived lane line indicated by the perceived lane line equation, wherein if the lane line confidence is greater than a preset perception confidence and the lane line rationality condition is met, it is confirmed that the corresponding perceived lane line indicated by the perceived lane line equation is output normally;
and checking each map lane line indicated by the map lane line equation, wherein if the positioning data meets the accuracy requirement, the map lane line is smooth, and the lane line rationality condition and the perceived lane line verification evaluation condition are met, it is confirmed that the corresponding map lane line indicated by the map lane line equation is output normally.
In one possible design, meeting the lane line rationality condition includes:
obtaining lane line equation coefficients of the perceived lane line equation and the map lane line equation;
obtaining the coefficient difference between each current-frame lane line equation coefficient and the corresponding previous-frame lane line equation coefficient, and taking the absolute value of each coefficient difference as the change value of that lane line equation coefficient;
obtaining the width difference between the coefficients of adjacent lane line equations, and taking the absolute value of the width difference as the lane width;
and if the lane line equation coefficients are in a preset equation coefficient interval, the change value of each perceived lane line equation coefficient is not larger than the corresponding preset value, and the lane width is in a preset width interval, confirming that the lane line rationality condition is met.
In one possible design, meeting the perceived lane line verification evaluation condition includes:
obtaining lane line equation coefficients of the perceived lane line equation and the map lane line equation;
and obtaining the coefficient difference between each perceived lane line equation coefficient and the corresponding map lane line equation coefficient, taking the absolute value of each coefficient difference as a check value, and if each check value is not greater than the corresponding preset check value, confirming that the perceived lane line verification evaluation condition is met.
In one possible design, the acquiring a fused lane line equation according to the normally output perceived lane line equation and map lane line equation includes:
if it is detected that all the perceived lane lines indicated by the perceived lane line equation are output normally, taking all the perceived lane lines as the fused lane line equation;
if it is detected that some perceived lane lines indicated by the perceived lane line equation are output normally and some abnormally, and all the map lane lines indicated by the map lane line equation are output normally, replacing the perceived lane line equation coefficients whose output is abnormal with the corresponding map lane line equation coefficients, so as to complete the perceived lane line equation, and confirming the completed perceived lane line equation as the fused lane line equation;
and if it is detected that all the perceived lane lines indicated by the perceived lane line equation are output abnormally, acquiring the fused lane line equation according to the normally output map lane lines.
In one possible design, if it is detected that all the perceived lane lines indicated by the perceived lane line equation are output abnormally, the acquiring the fused lane line equation according to the normally output map lane lines includes:
acquiring the output condition of the map lane lines for a preset number of frames before the current moment;
if all the map lane lines of the preset number of frames indicated by the map lane line equation are output normally, determining the map lane line equation as the fused lane line equation;
and if it is detected that all the lane lines indicated by the perceived lane line equation and the map lane line equation are output abnormally, not confirming a fused lane line equation.
In one possible design, the determining the lanes to which other vehicles belong according to the fused lane line equation and the position information of the other vehicles acquired by the road perception unit includes:
obtaining the lane distance between the other vehicle and each fused lane line indicated by the fused lane line equation, and confirming the lane formed by the two fused lane lines with the minimum lane distance as the lane corresponding to the other vehicle;
and if minimum lanes formed by adjacent fused lane lines with the same minimum lane distance are detected, determining, among those minimum lanes, the lane formed by the adjacent fused lane lines with the highest fused lane line confidence as the lane corresponding to the other vehicle.
In a second aspect, the present application provides a vehicle perception fusion device based on a high-precision map, including:
the first acquisition module, configured to acquire road perception information through a road perception unit of the vehicle, and fit the road perception information to generate a corresponding perceived lane line equation and perceived lane line confidence, wherein the perceived lane line confidence indicates the reliability of the perceived lane line;
the second acquisition module, configured to acquire map data according to positioning data of the vehicle, establish a road-section lane model according to the map data, convert the longitude and latitude coordinates of the lane line points of the road-section lane model into the running coordinate system of the vehicle, and fit in the running coordinate system to generate a corresponding map lane line equation;
the processing module, configured to acquire a fused lane line equation according to the normally output perceived lane line equation and map lane line equation, and determine the lanes to which other vehicles belong according to the fused lane line equation and the position information of the other vehicles acquired by the road perception unit;
and the execution module, configured to acquire a planned driving trajectory according to the fused lane lines indicated by the fused lane line equation and the other vehicles whose lanes have been determined, and realize vehicle control according to the planned driving trajectory and the driving information of the vehicle.
In a third aspect, the present application provides an electronic device comprising: a processor, and a memory communicatively coupled to the processor;
the memory stores computer-executable instructions;
and the processor executes the computer-executable instructions stored in the memory to implement the vehicle perception fusion method based on the high-precision map.
In a fourth aspect, the present application provides a computer-readable storage medium having stored therein computer-executable instructions that, when executed by a processor, implement the vehicle perception fusion method based on the high-precision map.
The application provides a vehicle perception fusion method, device and storage medium based on a high-precision map. A corresponding perceived lane line equation is fitted from the acquired road perception information. A road-section lane model is established from the map data corresponding to the vehicle, the longitude and latitude coordinates of its lane line points are converted into the running coordinate system of the vehicle, and a corresponding map lane line equation is fitted in that coordinate system. A fused lane line equation is then obtained from the normally output perceived lane line equation and map lane line equation, so that data from different sources are fused and the relationship between the perception information of the intelligent driving system and the high-precision map data is enriched. The lanes to which other vehicles belong are determined from the fused lane line equation and the acquired position information of the other vehicles, a planned driving trajectory is obtained from the fused lane lines and the other vehicles once their lanes are determined, and vehicle control is realized according to the planned driving trajectory and the driving information of the vehicle, thereby improving the perception robustness and stability of the vehicle's intelligent driving system.
Drawings
In order to illustrate the embodiments of the present application or the technical solutions of the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. The drawings in the following description are obviously only some embodiments of the present application; for those of ordinary skill in the art, other drawings can be obtained from them without inventive effort.
Fig. 1 is a schematic diagram of a high-precision map fusion architecture according to an embodiment of the present application;
fig. 2 is a schematic flow chart of a vehicle perception fusion method based on a high-precision map according to an embodiment of the present application;
fig. 3 is a second schematic flow chart of a vehicle perception fusion method based on a high-precision map according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of a vehicle perception fusion device based on a high-precision map according to an embodiment of the present application;
fig. 5 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present application.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the drawings, the same numbers in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the application; rather, they are merely examples of apparatus and methods consistent with some aspects of the application, as detailed in the appended claims. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the application without inventive effort fall within the scope of protection of the application.
In automatic driving, vehicle positioning data and lane line information are generally obtained through multi-sensor fusion. Lane line information is key environment perception information: it allows the vehicle, while driving safely in its current lane, to plan its route according to the lane line information around other vehicles and to keep track of their driving state, thereby ensuring driving safety.
Lane line information is generally obtained either by detecting and identifying lane lines with multiple sensors such as smart cameras, or from high-precision map data. If lane lines are detected and identified by smart cameras and similar sensors alone, the cameras are easily affected by the surrounding environment and may fail to acquire accurate data, so schemes that obtain lane line information by fusing multi-sensor data still have edge cases they cannot handle. If lane line information is identified from high-precision map data, the map has a production cycle, so the automated vehicle may suffer from map data errors such as untimely updates, producing inaccurate lane line information.
There are, of course, schemes that combine multiple sensors with a high-precision map to obtain lane line information, for example redundantly and complementarily outputting lane lines from a smart camera and a high-precision-map positioning controller to improve the stability and accuracy of lane line output under complex working conditions. Such a scheme, however, is limited to fusing the lane lines of the smart camera and the map and cannot be used in a high-level intelligent-driving domain controller system. Another scheme acquires lane line attribute information from smart-camera detection of lane lines and road edges together with the high-precision map, and completes the lane line search through positioning continuity detection, lane search, attribute matching detection and distance matching detection; but it only realizes lane line matching, does not include a function for fusing the perceived lane lines with the high-precision map, and does not fully exploit the relationship between the perception information of the intelligent driving system and the high-precision map data. The perception information remains one-dimensional, so the perception robustness and stability of the vehicle's intelligent driving system are low.
The application provides a vehicle perception fusion method based on a high-precision map. A corresponding perceived lane line equation is fitted from the acquired road perception information. A road-section lane model is established from the map data corresponding to the vehicle, the longitude and latitude coordinates of its lane line points are converted into the running coordinate system of the vehicle, and a corresponding map lane line equation is fitted in that coordinate system. A fused lane line equation is then obtained from the normally output perceived lane line equation and map lane line equation, so that data from different sources are fused and the relationship between the perception information of the intelligent driving system and the high-precision map data is enriched. The lanes to which other vehicles belong are determined from the fused lane line equation and the position information of the other vehicles, a planned driving trajectory is obtained from the fused lane lines and the other vehicles once their lanes are determined, and vehicle control is realized according to the planned driving trajectory and the driving information of the vehicle, thereby improving the perception robustness and stability of the vehicle's intelligent driving system.
The technical solution of the present application, and how it solves the above technical problems, are described below through specific embodiments. The following embodiments may be combined with each other, and the same or similar concepts or processes may not be repeated in some embodiments. Embodiments of the present application are described below with reference to the accompanying drawings.
Example one
Fig. 1 is a schematic diagram of a high-precision map fusion architecture according to an embodiment of the present application. Fig. 2 is a schematic flow chart of a vehicle perception fusion method based on a high-precision map according to an embodiment of the present application. As shown in connection with fig. 1 and 2, the method comprises:
s201, road perception information is obtained through a road perception unit of a vehicle, and a corresponding perception lane line equation and perception lane line confidence coefficient are generated through fitting according to the road perception information, wherein the perception lane line confidence coefficient is used for indicating the reliability of the perception lane line;
Specifically, the lidar point cloud and the images of the road environment acquired by the lidar and the camera in the sensor unit are sent to the road perception unit. The road perception unit perceives other vehicles and detects lane lines from the acquired point cloud and images, fits the detected lane lines, and sends the resulting perceived lane line equation, perceived lane line confidence and perceived position information of other vehicles to the fusion application set of the high-precision-map fusion perception unit, where they are fused with the map lane line equation generated from the map data.
S202, acquiring map data according to positioning data of the vehicle, establishing a road-section lane model according to the map data, converting the longitude and latitude coordinates of the lane line points of the road-section lane model into the running coordinate system of the vehicle, and fitting in the running coordinate system to generate a corresponding map lane line equation;
Specifically, the map engine in the high-precision-map fusion perception unit acquires map data from the positioning data, i.e. the positioning information obtained by the inertial navigation device, and sends it to the map reconstructor, which deserializes and parses the map data. Map reconstruction is performed from the parsed data message bodies: association relations are constructed and path characteristics are maintained, i.e. a road-section lane model is established from the map data. The longitude and latitude coordinates of the lane line points in the road-section lane model are converted into relative coordinates in the vehicle's own coordinate system according to the vehicle's position and orientation, and lane line fitting is then performed to obtain the corresponding map lane line equation.
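The patent does not spell out the conversion formulas, but the step can be illustrated with a minimal Python sketch. It assumes a small-area equirectangular approximation for the latitude/longitude-to-ego-frame projection, a heading measured clockwise from north, the WGS-84 equatorial radius, and an ordinary least-squares cubic fit; the function names are illustrative assumptions, not taken from the patent.

```python
import math

import numpy as np

EARTH_RADIUS_M = 6378137.0  # WGS-84 equatorial radius (assumed)


def latlon_to_vehicle_frame(lat, lon, ego_lat, ego_lon, ego_heading_rad):
    """Project one lane line point (lat, lon) into the ego frame
    (x forward, y left), using a local equirectangular approximation."""
    d_north = math.radians(lat - ego_lat) * EARTH_RADIUS_M
    d_east = (math.radians(lon - ego_lon) * EARTH_RADIUS_M
              * math.cos(math.radians(ego_lat)))
    # Rotate the north/east offset into the vehicle frame (heading assumed
    # to be measured clockwise from north).
    x = d_north * math.cos(ego_heading_rad) + d_east * math.sin(ego_heading_rad)
    y = d_north * math.sin(ego_heading_rad) - d_east * math.cos(ego_heading_rad)
    return x, y


def fit_map_lane_line(points_xy):
    """Least-squares fit of the cubic lane line model
    Y = A0 + A1*X + A2*X^2 + A3*X^3; returns (A0, A1, A2, A3)."""
    xs, ys = zip(*points_xy)
    a3, a2, a1, a0 = np.polyfit(xs, ys, deg=3)  # highest degree first
    return (a0, a1, a2, a3)
```

Each map lane line would be projected point by point and fitted independently, yielding one coefficient quadruple (A0, A1, A2, A3) per line, matching the cubic lane line model used throughout this application.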
S203, acquiring a fused lane line equation according to the normally output perceived lane line equation and map lane line equation, and determining the lanes to which other vehicles belong according to the fused lane line equation and the position information of the other vehicles acquired by the road perception unit;
Specifically, the fusion application set of the high-precision-map fusion perception unit selects all or part of the perceived lane lines output by the road perception unit and the map lane lines output by the map reconstructor, fusing them into the fused lane lines, and performs target screening according to the acquired position information of other vehicles and the fused lane lines, so as to determine the lanes to which the other vehicles belong.
S204, acquiring a planned driving trajectory according to the fused lane lines indicated by the fused lane line equation and the other vehicles whose lanes have been determined, and controlling the vehicle according to the planned driving trajectory and the driving information of the vehicle;
Specifically, after the fused lane lines are obtained and the lanes of other vehicles are confirmed, the lane where the ego vehicle is located is confirmed from the fused lane lines and the attribute information of that lane is obtained. Decision and motion planning then yield the planned driving trajectory from the fused lane lines output by the high-precision-map fusion perception unit and the driving information of the other vehicles whose lanes have been confirmed, and vehicle control is performed according to the planned trajectory and the lane attribute information.
The application provides a vehicle perception fusion method based on a high-precision map. A corresponding perceived lane line equation is fitted from the acquired road perception information. A road-section lane model is established from the map data corresponding to the vehicle, the longitude and latitude coordinates of its lane line points are converted into the running coordinate system of the vehicle, and a corresponding map lane line equation is fitted in that coordinate system. A fused lane line equation is then obtained from the normally output perceived lane line equation and map lane line equation, so that data from different sources are fused and the relationship between the perception information of the intelligent driving system and the high-precision map data is enriched. The lanes to which other vehicles belong are determined from the fused lane line equation and the acquired position information of the other vehicles, a planned driving trajectory is obtained from the fused lane lines and the other vehicles once their lanes are determined, and vehicle control is realized according to the planned driving trajectory and the driving information of the vehicle, thereby improving the perception robustness and stability of the vehicle's intelligent driving system.
The vehicle perception fusion method based on the high-precision map is described in detail below through a specific embodiment.
Example two
Fig. 3 is the second schematic flow chart of a vehicle perception fusion method based on a high-precision map according to an embodiment of the present application. As shown in fig. 3, the method includes:
s301, road perception information is obtained through a road perception unit of a vehicle, and a corresponding perception lane line equation and perception lane line confidence coefficient are generated through fitting according to the road perception information, wherein the perception lane line confidence coefficient is used for indicating the reliability of the perception lane line;
specifically, in this embodiment, the implementation of S301 is similar to the implementation of S201 in the first embodiment of the present application, and will not be described here again.
S302, acquiring map data according to positioning data of the vehicle, establishing a road-section lane model according to the map data, converting the longitude and latitude coordinates of the lane line points of the road-section lane model into the running coordinate system of the vehicle, and fitting in the running coordinate system to generate a corresponding map lane line equation;
Specifically, the map engine in the high-precision-map fusion perception unit obtains map data according to the positioning information, i.e. positioning data, received from the inertial navigation device, and sends it to the map reconstructor over the ADASIS v3 protocol. The map reconstructor parses the ADASIS v3 messages to obtain structured map data, and builds association relations by processing the control message bodies, position message bodies, attribute message bodies and global information, removing stale paths and their elements in time, creating new paths and updating elements. A road-section and lane model is built from the topology information of the current road network. The longitude and latitude coordinates of the lane line points output by the map reconstructor are converted into relative coordinates in the ego coordinate system according to the vehicle's position and orientation, and lane line fitting is then performed to obtain the cubic lane line equation Y = A0 + A1·X + A2·X² + A3·X³ representing each map lane line.
S303, checking each perceived lane line indicated by the perceived lane line equation;
Specifically, after the corresponding perceived lane line equation and perceived lane line confidence are generated by fitting the road perception information, it is checked whether the four lane lines of the road where the vehicle is currently located, namely far-left, left, right and far-right, can be confirmed. If the corresponding four lane lines can be confirmed, the confirmed lane lines are checked further; if they cannot be confirmed, a new perceived lane line equation and perceived lane line confidence are generated by fitting newly acquired road perception information comprising the lidar point cloud and image data.
S304, if the lane line confidence is greater than the preset perception confidence and the lane line rationality condition is met, confirming that the corresponding perceived lane line indicated by the perceived lane line equation is output normally;
Specifically, the lane line equation coefficients of the perceived lane line equation and the map lane line equation are obtained; the coefficient difference between each current-frame coefficient and the corresponding previous-frame coefficient is obtained, and its absolute value is taken as the change value of that coefficient; the width difference between the coefficients of adjacent lane line equations is obtained, and its absolute value is taken as the lane width. When more than 60% of the perceived lane line equation coefficients are detected to lie in the preset equation coefficient interval, the change value of each perceived lane line equation coefficient is not larger than the corresponding preset value, and the lane width is in the preset width interval, the corresponding perceived lane line that also meets the preset perception confidence is confirmed as output normally; otherwise its output is confirmed as abnormal;
Further, for the requirement that the change value of each perceived lane line equation coefficient is not more than the corresponding preset value: |A0 - A'0| ≤ 0.05, |A1 - A'1| ≤ 0.001, |A2 - A'2| ≤ 0.00001 and |A3 - A'3| ≤ 0.000001, where Ax is the current-frame perceived lane line equation coefficient and A'x is the previous-frame perceived lane line coefficient, x = 0, 1, 2, 3;
Further, for obtaining the width difference of the adjacent lane line equation coefficients and taking its absolute value as the lane width: 1.25 ≤ |AL0 - AR0| ≤ 4, where AL0 is the left lane line equation coefficient A0 and AR0 is the right lane line equation coefficient A0; and for the lane line equation coefficient lying in the preset equation coefficient interval: -4 ≤ A0 ≤ 4. Other values may also be set for the preset equation coefficient interval, as long as normal output of the perceived lane line can be satisfied.
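As a concrete reading of the rationality check in S304, the sketch below applies the thresholds quoted above. The coefficient ordering (A0 to A3) follows the cubic lane line equation, the interval check is applied only to A0 (the only coefficient for which this embodiment quotes an interval), metres are assumed for the width bounds, and the function name is illustrative.

```python
import numpy as np

# Thresholds quoted in this embodiment for the coefficients A0..A3.
MAX_COEF_CHANGE = np.array([0.05, 0.001, 0.00001, 0.000001])  # |Ax - A'x|
A0_INTERVAL = (-4.0, 4.0)          # preset equation coefficient interval
LANE_WIDTH_INTERVAL = (1.25, 4.0)  # preset width interval (metres assumed)


def lane_line_is_rational(coefs, prev_coefs, left_a0, right_a0):
    """Rationality condition for one perceived lane line: coefficient in
    the preset interval, bounded frame-to-frame change, plausible width."""
    coefs = np.asarray(coefs, dtype=float)
    prev_coefs = np.asarray(prev_coefs, dtype=float)
    in_interval = A0_INTERVAL[0] <= coefs[0] <= A0_INTERVAL[1]
    stable = bool(np.all(np.abs(coefs - prev_coefs) <= MAX_COEF_CHANGE))
    lane_width = abs(left_a0 - right_a0)
    width_ok = LANE_WIDTH_INTERVAL[0] <= lane_width <= LANE_WIDTH_INTERVAL[1]
    return in_interval and stable and width_ok
```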
S305, checking each map lane line indicated by the map lane line equation;
Specifically, after the corresponding map lane line equation is fitted from the lane line longitude and latitude coordinates converted into the vehicle's running coordinate system, the map lane lines indicated by the map lane line equation must be checked. If the check finds that the output of a corresponding map lane line is abnormal, the latest map data is re-acquired according to the current positioning data so as to generate a new map lane line equation.
S306, if the positioning data meets the accuracy requirement, the map lane lines are smooth, and the lane line rationality condition and the perceived lane line verification evaluation condition are met, confirming that the corresponding map lane lines indicated by the map lane line equation are output normally;
Specifically, the lane line equation coefficients of the perceived lane line equation and the map lane line equation are obtained, the coefficient difference between each perceived lane line equation coefficient and the corresponding map lane line equation coefficient is obtained, and the absolute value of each coefficient difference is taken as a check value. If each check value is not greater than the corresponding preset check value, the perceived lane line verification evaluation condition is confirmed as met; map lane lines meeting both the perceived lane line verification evaluation condition and the lane line rationality condition are confirmed as output normally, otherwise their output is confirmed as abnormal;
Further, for meeting the perceived lane line verification evaluation condition: |perceived A0 - map A0| ≤ 0.18, |perceived A1 - map A1| ≤ 0.003, |perceived A2 - map A2| ≤ 0.00003 and |perceived A3 - map A3| ≤ 0.000003, where perceived Ax is a perceived lane line equation coefficient and map Ax is a map lane line equation coefficient, x = 0, 1, 2, 3.
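A corresponding sketch of the perceived lane line verification evaluation used in S306, with the preset check values quoted above; the names are again illustrative.

```python
import numpy as np

# Preset check values quoted in this embodiment, for A0..A3.
MAX_CHECK_VALUE = np.array([0.18, 0.003, 0.00003, 0.000003])


def passes_verification(perceived_coefs, map_coefs):
    """Perceived lane line verification evaluation: every check value
    |perceived Ax - map Ax| must not exceed its preset limit."""
    diff = np.abs(np.asarray(perceived_coefs) - np.asarray(map_coefs))
    return bool(np.all(diff <= MAX_CHECK_VALUE))
```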
S307, if it is detected that all the perceived lane lines indicated by the perceived lane line equation are output normally, taking all the perceived lane lines as the fused lane line equation;
Specifically, if all the perceived lane lines indicated by the perceived lane line equation are detected as output normally, all the perceived lane lines are preferentially taken as the fused lane line equation. If some perceived lane lines are output normally and some abnormally while all the map lane lines indicated by the map lane line equation are output normally, the perceived lane line equation coefficients whose output is abnormal are replaced by the corresponding map lane line equation coefficients, so as to complete the perceived lane line equation, and the completed equation is confirmed as the fused lane line equation (the full selection logic of S307 to S309 is sketched after S309 below).
S308, if it is detected that all the perceived lane lines indicated by the perceived lane line equation are output abnormally, acquiring the output condition of the map lane lines for a preset number of frames before the current moment;
Specifically, if all the perceived lane lines indicated by the perceived lane line equation are detected as failing the preset perception confidence and lane line rationality conditions, and all the perceived lane lines are therefore confirmed as output abnormally, whether the map lane line equation is confirmed as the fused lane line equation is decided according to the output condition of the map lane lines over the preceding history frames.
S309, if all the map lane lines of the preset number of frames indicated by the map lane line equation are output normally, determining the map lane line equation as the fused lane line equation;
Specifically, if all the map lane lines of the preset number of frames, for example the previous five frames, are detected as output normally, the map lane line equation is confirmed as the fused lane line equation. If abnormal output is detected among the map lane lines of the previous five frames, i.e. all the lane lines indicated by both the perceived lane line equation and the map lane line equation are detected as abnormal, no fused lane line equation is confirmed, and a new perceived lane line equation and map lane line equation are generated from the acquired road perception information and map data.
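The selection logic of S307 to S309 can be summarised as follows. The sketch assumes that per-line normal-output flags have already been produced by the checks above and that a per-frame history of map-line status is kept; the five-frame window follows the example in this embodiment, combinations the embodiment does not spell out fall through to no confirmation, and all names are illustrative.

```python
MAP_HISTORY_FRAMES = 5  # this embodiment checks the previous five frames


def fuse_lane_lines(perc_coefs, perc_ok, map_coefs, map_ok, map_ok_history):
    """Return one (A0, A1, A2, A3) tuple per fused lane line, or None when
    no fused lane line equation is confirmed for this frame.

    perc_coefs / map_coefs: per-line coefficient tuples.
    perc_ok / map_ok: per-line normal-output flags.
    map_ok_history: per-frame flags, True when all map lines were normal.
    """
    if all(perc_ok):
        # S307: every perceived lane line is output normally.
        return list(perc_coefs)
    if any(perc_ok) and all(map_ok):
        # Partial perception failure: complete the perceived equation by
        # substituting the corresponding map coefficients.
        return [p if ok else m
                for p, ok, m in zip(perc_coefs, perc_ok, map_coefs)]
    # S308/S309: perception fully abnormal; fall back to the map only if
    # the map lines were also normal over the preceding frames.
    history = list(map_ok_history)[-MAP_HISTORY_FRAMES:]
    if all(map_ok) and len(history) == MAP_HISTORY_FRAMES and all(history):
        return list(map_coefs)
    return None  # neither source trusted: no fused equation this frame
```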
S310, acquiring the lane distance between the other vehicles and each fused lane line indicated by the fused lane line equation, and confirming the lane formed by the two fused lane lines with the minimum lane distance as the lane corresponding to the other vehicle;
Specifically, the position information of other vehicles acquired by the road perception unit is substituted into the fused lane line equation to confirm the lanes of the other vehicles, i.e. to screen the lanes to which they belong. The distance between each other vehicle and each corresponding lane line, i.e. the lane distance, is calculated from the current number of lane lines and the fused lane line equation, and the lane is confirmed from the lane width and the minimum lane distance. If minimum lanes formed by adjacent fused lane lines with the same minimum lane distance are detected, the lane formed by the adjacent fused lane lines with the highest fused lane line confidence is confirmed, among those minimum lanes, as the lane corresponding to the other vehicle.
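A sketch of this lane screening step, assuming the fused lane lines are ordered left to right, that the other vehicle's position is already expressed in the ego frame, and that the lane distance of a candidate lane is read as the combined distance to its two bounding lines, which is smallest for the lane that actually contains the target; the confidence tiebreak follows the text above, and the names are illustrative.

```python
import numpy as np


def assign_lane(target_x, target_y, fused_coefs, line_confidence):
    """Return the index i of the lane bounded by fused lines i and i+1
    to which the target vehicle at (target_x, target_y) belongs."""
    if len(fused_coefs) < 2:
        return None  # at least two lane lines are needed to form a lane
    # Lateral distance from the target to each fused lane line, evaluated
    # at the target's longitudinal position (np.polyval wants A3..A0).
    dists = [abs(np.polyval(coefs[::-1], target_x) - target_y)
             for coefs in fused_coefs]
    # Combined distance to the two bounding lines of each candidate lane;
    # smallest for the lane that actually contains the target.
    lane_dists = [dists[i] + dists[i + 1] for i in range(len(dists) - 1)]
    best = min(lane_dists)
    tied = [i for i, d in enumerate(lane_dists) if np.isclose(d, best)]
    # Tiebreak (e.g. a target sitting on a shared line): prefer the lane
    # whose bounding lines carry the higher fused lane line confidence.
    return max(tied, key=lambda i: line_confidence[i] + line_confidence[i + 1])
```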
S311, acquiring a planned driving trajectory according to the fused lane lines indicated by the fused lane line equation and the other vehicles whose lanes have been determined, and realizing vehicle control according to the planned driving trajectory and the driving information of the vehicle;
Specifically, after the fused lane lines indicated by the fused lane line equation and the lanes to which the other vehicles respectively belong are confirmed, the lane number of the ego vehicle is confirmed from the fused lane lines, and the speed-limit information of the lane and the lateral and longitudinal gradient and curvature information ahead, i.e. the attribute information of the lane, is obtained. This realizes the function of monitoring the current road or lane surface information, so that the vehicle's control unit controls the vehicle based on the planned driving trajectory and according to attribute information including the lane curvature, heading and gradient.
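The heading and curvature supplied to the control unit follow directly from the cubic lane line model. The sketch below uses the standard plane-curve relations theta = arctan(y') and kappa = y'' / (1 + y'^2)^(3/2); these are general mathematics rather than formulas quoted from the patent, and the function name is illustrative.

```python
import math


def lane_heading_and_curvature(coefs, x=0.0):
    """Heading angle (rad) and curvature (1/m) of the cubic lane line
    Y = A0 + A1*X + A2*X^2 + A3*X^3 at longitudinal position x."""
    _, a1, a2, a3 = coefs
    dy = a1 + 2.0 * a2 * x + 3.0 * a3 * x * x  # y'
    d2y = 2.0 * a2 + 6.0 * a3 * x              # y''
    heading = math.atan(dy)
    curvature = d2y / (1.0 + dy * dy) ** 1.5
    return heading, curvature
```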
The application provides a vehicle perception fusion method based on a high-precision map. A corresponding perceived lane line equation is fitted from the acquired road perception information. A road-section lane model is established from the map data corresponding to the vehicle, the longitude and latitude coordinates of its lane line points are converted into the running coordinate system of the vehicle, and a corresponding map lane line equation is fitted in that coordinate system. A fused lane line equation is then obtained from the normally output perceived lane line equation and map lane line equation, so that data from different sources are fused and the relationship between the perception information of the intelligent driving system and the high-precision map data is enriched. The lanes to which other vehicles belong are determined from the fused lane line equation and the acquired position information of the other vehicles, a planned driving trajectory is obtained from the fused lane lines and the other vehicles once their lanes are determined, and vehicle control is realized according to the planned driving trajectory and the driving information of the vehicle, thereby improving the perception robustness and stability of the vehicle's intelligent driving system.
The embodiments of the application may divide the electronic device or the main control device into functional modules according to the above method examples; for example, each functional module may correspond to one function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in hardware or as a software functional module. It should be noted that the division of modules in the embodiments of the application is schematic and is merely a logical function division; other division manners are possible in actual implementation.
Fig. 4 is a schematic structural diagram of a vehicle perception fusion device based on a high-precision map according to an embodiment of the present application. As shown in fig. 4, the apparatus 40 includes:
the first acquisition module 401, configured to acquire road perception information through a road perception unit of the vehicle, and fit the road perception information to generate a corresponding perceived lane line equation and perceived lane line confidence, wherein the perceived lane line confidence indicates the reliability of the perceived lane line;
the second acquisition module 402, configured to acquire map data according to positioning data of the vehicle, establish a road-section lane model according to the map data, convert the longitude and latitude coordinates of the lane line points of the road-section lane model into the running coordinate system of the vehicle, and fit in the running coordinate system to generate a corresponding map lane line equation;
the processing module 403, configured to acquire a fused lane line equation according to the normally output perceived lane line equation and map lane line equation, and determine the lanes to which other vehicles belong according to the fused lane line equation and the position information of the other vehicles acquired by the road perception unit;
and the execution module 404, configured to acquire a planned driving trajectory according to the fused lane lines indicated by the fused lane line equation and the other vehicles whose lanes have been determined, and realize vehicle control according to the planned driving trajectory and the driving information of the vehicle.
Further, the processing module 403 is specifically configured to: before the fused lane line equation is acquired according to the perceived lane line equation and the map lane line equation, check each perceived lane line indicated by the perceived lane line equation, wherein if the lane line confidence is greater than a preset perception confidence and the lane line rationality condition is met, it is confirmed that the corresponding perceived lane line is output normally; and check each map lane line indicated by the map lane line equation, wherein if the positioning data meets the accuracy requirement, the map lane line is smooth, and the lane line rationality condition and the perceived lane line verification evaluation condition are met, it is confirmed that the corresponding map lane line is output normally.
Further, the processing module 403 is specifically configured to obtain the lane line equation coefficients of the perceived lane line equation and the map lane line equation; obtain the coefficient difference between each current-frame lane line equation coefficient and the corresponding previous-frame coefficient, taking the absolute value of each coefficient difference as the change value of that coefficient; obtain the width difference between the coefficients of adjacent lane line equations, taking its absolute value as the lane width; and, if the lane line equation coefficients are in the preset equation coefficient interval, the change value of each perceived lane line equation coefficient is not larger than the corresponding preset value, and the lane width is in the preset width interval, confirm that the lane line rationality condition is met.
Further, the processing module 403 is specifically configured to obtain the lane line equation coefficients of the perceived lane line equation and the map lane line equation, obtain the coefficient difference between each perceived lane line equation coefficient and the corresponding map lane line equation coefficient, take the absolute value of each coefficient difference as a check value, and, if each check value is not greater than the corresponding preset check value, confirm that the perceived lane line verification evaluation condition is met.
Further, the processing module 403 is specifically configured to: take all the perceived lane lines as the fused lane line equation if all the perceived lane lines indicated by the perceived lane line equation are detected as output normally; if some perceived lane lines are output normally and some abnormally while all the map lane lines indicated by the map lane line equation are output normally, replace the perceived lane line equation coefficients whose output is abnormal with the corresponding map lane line equation coefficients so as to complete the perceived lane line equation, and confirm the completed equation as the fused lane line equation; and, if all the perceived lane lines indicated by the perceived lane line equation are detected as output abnormally, acquire the fused lane line equation according to the normally output map lane lines.
Further, the processing module 403 is specifically configured to acquire the output condition of the map lane lines for a preset number of frames before the current moment; if all the map lane lines of the preset number of frames indicated by the map lane line equation are output normally, determine the map lane line equation as the fused lane line equation; and, if all the lane lines indicated by the perceived lane line equation and the map lane line equation are detected as output abnormally, not confirm a fused lane line equation.
Further, the processing module 403 is specifically configured to obtain the lane distance between the other vehicles and each fused lane line indicated by the fused lane line equation, determine the lane formed by the two fused lane lines with the smallest lane distance as the lane corresponding to the other vehicle, and, if minimum lanes formed by adjacent fused lane lines with the same smallest lane distance are detected, determine, among those minimum lanes, the lane formed by the adjacent fused lane lines with the highest fused lane line confidence as the lane corresponding to the other vehicle.
The vehicle perception fusion device based on the high-precision map provided in this embodiment may execute the vehicle perception fusion method based on the high-precision map in the above embodiment, and its implementation principle and technical effect are similar, and this embodiment will not be described here again.
In the specific implementation of the vehicle perception fusion device based on the high-precision map, each module may be implemented as a processor, and the processor may execute computer-executed instructions stored in the memory, so that the processor executes the vehicle perception fusion method based on the high-precision map.
Fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 5, the electronic device 50 includes at least one processor 501 and a memory 502, and further comprises a communication unit 503. The processor 501, the memory 502 and the communication unit 503 are connected via a bus 504.
In a specific implementation, the at least one processor 501 executes the computer-executable instructions stored in the memory 502, so that the at least one processor 501 executes the vehicle-aware fusion method based on the high-precision map as executed on the electronic device side.
The specific implementation process of the processor 501 may refer to the above-mentioned method embodiment, and its implementation principle and technical effects are similar, and this embodiment will not be described herein again.
In the above embodiments, it should be understood that the processor may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), or the like. A general-purpose processor may be a microprocessor, or any conventional processor. The steps of the method disclosed in connection with the present application may be embodied directly as being executed by a hardware processor, or executed by a combination of hardware and software modules in the processor.
The memory may comprise high-speed RAM memory, and may further comprise non-volatile memory (NVM), such as at least one magnetic disk memory.
The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. Buses may be divided into address buses, data buses, control buses, and so on. For ease of illustration, the buses in the drawings of the present application are not limited to only one bus or one type of bus.
The solutions provided by the embodiments of the application have been described in terms of the functions implemented by the electronic device and the main control device. To implement the above functions, the electronic device or the main control device includes corresponding hardware structures and/or software modules for performing the respective functions. The embodiments of the present application can be implemented in hardware, or in a combination of hardware and computer software, in conjunction with the exemplary units and algorithm steps described in the embodiments disclosed herein. Whether a function is executed by hardware or by computer-software-driven hardware depends on the particular application and the design constraints of the technical solution. Those skilled in the art may use different approaches to implement the described functions for each particular application, but such implementations should not be considered beyond the scope of the embodiments of the present application.
The application also provides a computer-readable storage medium storing computer-executable instructions which, when executed by a processor, implement the vehicle perception fusion method based on the high-precision map described above.
The computer readable storage medium described above may be implemented by any type of volatile or non-volatile memory device or combination thereof, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk. A readable storage medium can be any available medium that can be accessed by a general purpose or special purpose computer.
An exemplary readable storage medium is coupled to the processor such that the processor can read information from, and write information to, the readable storage medium. Alternatively, the readable storage medium may be integral to the processor. The processor and the readable storage medium may reside in an application-specific integrated circuit (ASIC), or may reside as discrete components in an electronic device or a main control device.
The present application also provides a computer program product comprising a computer program stored in a readable storage medium. At least one processor of an electronic device can read the computer program from the readable storage medium, and execution of the computer program by the at least one processor causes the electronic device to implement the solution provided by any one of the embodiments described above.
Those of ordinary skill in the art will appreciate that: all or part of the steps for implementing the method embodiments described above may be performed by hardware associated with program instructions. The foregoing program may be stored in a computer readable storage medium. The program, when executed, performs steps including the method embodiments described above; and the aforementioned storage medium includes: various media that can store program code, such as ROM, RAM, magnetic or optical disks.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the application.

Claims (10)

1. A vehicle perception fusion method based on a high-precision map, the method comprising:
acquiring road perception information through a road perception unit of a vehicle, and fitting the road perception information to generate a corresponding perceived lane line equation and a perceived lane line confidence, wherein the perceived lane line confidence indicates the reliability of the perceived lane line;
acquiring map data according to positioning data of the vehicle, establishing a road section lane model according to the map data, converting the longitude and latitude coordinates of the lane line points of the road section lane model into a running coordinate system of the vehicle, and fitting in the running coordinate system to generate a corresponding map lane line equation;
acquiring a fused lane line equation according to the normally output perceived lane line equation and map lane line equation, and determining the lanes to which other vehicles belong according to the fused lane line equation and the position information of the other vehicles acquired by the road perception unit;
and acquiring a planned running track according to the fused lane lines indicated by the fused lane line equation and the other vehicles whose lanes have been determined, and controlling the vehicle according to the planned running track and the running information of the vehicle.
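By way of illustration only, the fitting step of claim 1 might be sketched in Python as below. The cubic lane line model y = c0 + c1·x + c2·x² + c3·x³ and the residual-based confidence heuristic are assumptions of the sketch; the claim fixes neither a polynomial order nor a confidence formula.

```python
import numpy as np

def fit_lane_line(points_xy: np.ndarray):
    """Fit y = c0 + c1*x + c2*x**2 + c3*x**3 to perceived lane points.

    points_xy: (N, 2) array of (x, y) lane points in the running coordinate
    system (x forward, y lateral). Returns (coefficients c0..c3, confidence).
    """
    x, y = points_xy[:, 0], points_xy[:, 1]
    # np.polyfit returns the highest-order coefficient first; reverse to c0..c3.
    coeffs = np.polyfit(x, y, deg=3)[::-1]
    # Mean absolute fitting residual (metres).
    residual = float(np.abs(np.polyval(coeffs[::-1], x) - y).mean())
    # Heuristic confidence: more points and a smaller residual -> more reliable.
    confidence = min(1.0, len(x) / 50.0) * float(np.exp(-residual))
    return coeffs, confidence
```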
2. The method of claim 1, wherein, before the acquiring a fused lane line equation according to the perceived lane line equation and the map lane line equation, the method further comprises:
checking each perceived lane line indicated by the perceived lane line equation, wherein if the perceived lane line confidence is greater than a preset perception confidence and a lane line rationality condition is met, the corresponding perceived lane line indicated by the perceived lane line equation is confirmed to be normally output;
and checking each map lane line indicated by the map lane line equation, wherein if the positioning data meets the accuracy requirement, the map lane line is smooth, and the lane line rationality condition and a perceived lane line verification evaluation condition are met, the corresponding map lane line indicated by the map lane line equation is confirmed to be normally output.
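A minimal sketch of the two "normal output" checks of claim 2 follows. The preset confidence value and the boolean sub-conditions are placeholders assumed to be computed elsewhere; the claim fixes only the combination logic.

```python
# Illustrative "normal output" gating per claim 2.
def perceived_line_is_normal(perceived_confidence: float,
                             rationality_ok: bool,
                             preset_confidence: float = 0.6) -> bool:
    # Perceived line: confidence above threshold AND rationality condition met.
    return perceived_confidence > preset_confidence and rationality_ok


def map_line_is_normal(positioning_accurate: bool,
                       line_is_smooth: bool,
                       rationality_ok: bool,
                       verification_ok: bool) -> bool:
    # Map line: accurate positioning, smooth geometry, and both conditions met.
    return (positioning_accurate and line_is_smooth
            and rationality_ok and verification_ok)
```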
3. The method of claim 2, wherein meeting the lane line rationality condition comprises:
obtaining lane line equation coefficients of the perceived lane line equation and the map lane line equation;
obtaining the coefficient difference between each frame's lane line equation coefficients and the corresponding coefficients of the previous frame, and taking the absolute value of each coefficient difference as the change value of that lane line equation coefficient;
obtaining the width difference of the equation coefficients of adjacent lane lines, and taking the absolute value of the width difference as the lane width;
and determining that the lane line rationality condition is met if each lane line equation coefficient is in a preset equation coefficient interval, the change value of each perceived lane line equation coefficient is not greater than its corresponding preset value, and the lane width is in a preset width interval.
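The rationality condition of claim 3 might be sketched as below, assuming the coefficient bounds, per-coefficient change limits, and the lane width interval are supplied as calibration parameters; taking the constant terms of adjacent lane lines as the width difference is likewise an assumption of the sketch.

```python
import numpy as np

def rationality_ok(coeffs: np.ndarray,
                   prev_coeffs: np.ndarray,
                   adjacent_coeffs: np.ndarray,
                   coeff_lo: np.ndarray,
                   coeff_hi: np.ndarray,
                   max_change: np.ndarray,
                   width_lo: float = 2.5,
                   width_hi: float = 4.5) -> bool:
    # Every coefficient must lie inside the preset equation coefficient interval.
    in_interval = np.all((coeffs >= coeff_lo) & (coeffs <= coeff_hi))
    # |current frame - previous frame| per coefficient must not exceed its preset value.
    change_ok = np.all(np.abs(coeffs - prev_coeffs) <= max_change)
    # Lane width: absolute difference of the adjacent lane lines' constant terms.
    lane_width = abs(coeffs[0] - adjacent_coeffs[0])
    width_ok = width_lo <= lane_width <= width_hi
    return bool(in_interval and change_ok and width_ok)
```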
4. The method of claim 2, wherein satisfying the perceived lane line verification evaluation condition comprises:
obtaining lane line equation coefficients of the perceived lane line equation and the map lane line equation;
and obtaining the coefficient difference between each perceived lane line equation coefficient and the corresponding map lane line equation coefficient, taking the absolute value of each coefficient difference as a check value, and confirming that the perceived lane line verification evaluation condition is met if each check value is not greater than its corresponding preset check value.
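The verification evaluation condition of claim 4 reduces to an element-wise absolute-difference check; in this sketch the vector of preset check values is a placeholder supplied by the caller.

```python
import numpy as np

def verification_ok(perceived_coeffs: np.ndarray,
                    map_coeffs: np.ndarray,
                    preset_check: np.ndarray) -> bool:
    # Check value = |perceived coefficient - map coefficient| per coefficient;
    # all check values must stay within their preset check values.
    check_values = np.abs(perceived_coeffs - map_coeffs)
    return bool(np.all(check_values <= preset_check))
```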
5. The method of claim 1, wherein the acquiring a fused lane line equation according to the normally output perceived lane line equation and map lane line equation comprises:
if all the perceived lane lines indicated by the perceived lane line equation are detected to be normally output, taking the perceived lane line equation as the fused lane line equation;
if some of the perceived lane lines indicated by the perceived lane line equation are detected to be output normally and others abnormally, and all the map lane lines indicated by the map lane line equation are output normally, replacing the coefficients of the abnormally output perceived lane lines with the corresponding map lane line equation coefficients so as to complete the perceived lane line equation, and confirming the completed perceived lane line equation as the fused lane line equation;
and if all the perceived lane lines indicated by the perceived lane line equation are detected to be abnormally output, acquiring the fused lane line equation according to the normally output map lane lines.
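A sketch of the three fusion cases of claim 5, assuming lane lines are keyed by position (e.g. "left", "right"); the dictionary layout is an assumption, and the multi-frame history check of claim 6 is omitted here.

```python
from typing import Dict, Optional
import numpy as np

def fuse_lane_lines(perceived: Dict[str, np.ndarray],
                    perceived_normal: Dict[str, bool],
                    map_lines: Dict[str, np.ndarray],
                    map_normal: Dict[str, bool]) -> Optional[Dict[str, np.ndarray]]:
    if all(perceived_normal.values()):
        # Case 1: every perceived lane line is normal -> use them as-is.
        return dict(perceived)
    if any(perceived_normal.values()) and all(map_normal.values()):
        # Case 2: complete the abnormal perceived lines with map coefficients.
        return {key: perceived[key] if perceived_normal[key] else map_lines[key]
                for key in perceived}
    if all(map_normal.values()):
        # Case 3: all perceived lines abnormal -> fall back to the map lines
        # (subject to the history check of claim 6, omitted here).
        return dict(map_lines)
    return None  # No fused lane line equation can be confirmed.
```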
6. The method of claim 5, wherein, if all the perceived lane lines indicated by the perceived lane line equation are detected to be abnormally output, the acquiring the fused lane line equation according to the normally output map lane lines comprises:
acquiring the output condition of the map lane lines for a preset number of frames before the current moment;
if all the map lane lines indicated by the map lane line equation are output normally throughout the preset number of frames, determining the map lane line equation as the fused lane line equation;
and if all the lane lines indicated by the perceived lane line equation and the map lane line equation are detected to be abnormally output, confirming no fused lane line equation.
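The history gate of claim 6 might be kept as a rolling record of whether all map lane lines were output normally; the preset frame count of 10 below is a placeholder assumption.

```python
from collections import deque

class MapLineHistory:
    """Rolling record of whether all map lane lines were output normally."""

    def __init__(self, preset_frames: int = 10):  # frame count is a placeholder
        self._history = deque(maxlen=preset_frames)

    def update(self, all_map_lines_normal: bool) -> None:
        self._history.append(all_map_lines_normal)

    def map_fallback_allowed(self) -> bool:
        # The map lane line equation is only taken as the fused equation if
        # every one of the preset number of frames before now was normal.
        return (len(self._history) == self._history.maxlen
                and all(self._history))
```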
7. The method of claim 1, wherein the determining the lanes to which other vehicles belong according to the fused lane line equation and the position information of the other vehicles acquired by the road perception unit comprises:
obtaining the lane distance between each other vehicle and each fused lane line indicated by the fused lane line equation, and confirming the lane formed by the two fused lane lines with the minimum lane distances as the lane corresponding to that other vehicle;
and if adjacent fused lane lines with the same minimum lane distance form more than one candidate lane, determining, among those candidate lanes, the lane formed by the adjacent fused lane lines with the higher fused lane line confidence as the lane corresponding to the other vehicle.
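A sketch of the lane assignment of claim 7, taking the lane distance as the lateral offset between the other vehicle and each fused lane line evaluated at the vehicle's longitudinal position; the distance metric and the tie-break bookkeeping are assumptions of the sketch.

```python
import numpy as np

def assign_lane(vehicle_xy, fused_lines, confidences):
    """Return indices of the two fused lane lines bounding the other vehicle.

    vehicle_xy: (x, y) of the other vehicle in the running coordinate system.
    fused_lines: list of coefficient arrays [c0, c1, c2, c3] per fused line.
    confidences: per-line fused lane line confidence, used for the tie-break.
    """
    x, y = vehicle_xy
    # Lane distance: lateral offset to each fused line at the vehicle's x.
    dists = [abs(y - np.polyval(c[::-1], x)) for c in fused_lines]
    order = np.argsort(dists)
    first, second = int(order[0]), int(order[1])
    # Tie-break of claim 7: if two adjacent lines are equally close, prefer
    # the candidate lane whose line carries the higher confidence.
    if len(order) > 2 and np.isclose(dists[second], dists[int(order[2])]):
        third = int(order[2])
        if confidences[third] > confidences[second]:
            second = third
    return tuple(sorted((first, second)))
```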
8. A vehicle perception fusion device based on a high-precision map, comprising:
the first acquisition module is used for acquiring road perception information through a road perception unit of a vehicle, and fitting the road perception information to generate a corresponding perceived lane line equation and a perceived lane line confidence, wherein the perceived lane line confidence indicates the reliability of the perceived lane line;
the second acquisition module is used for acquiring map data according to positioning data of the vehicle, establishing a road section lane model according to the map data, converting the longitude and latitude coordinates of the lane line points of the road section lane model into a running coordinate system of the vehicle, and fitting in the running coordinate system to generate a corresponding map lane line equation;
the processing module is used for acquiring a fused lane line equation according to the normally output perceived lane line equation and map lane line equation, and determining the lanes to which other vehicles belong according to the fused lane line equation and the position information of the other vehicles acquired by the road perception unit;
and the execution module is used for acquiring a planned running track according to the fused lane lines indicated by the fused lane line equation and the other vehicles whose lanes have been determined, and controlling the vehicle according to the planned running track and the running information of the vehicle.
9. An electronic device, comprising: a processor, and a memory communicatively coupled to the processor;
the memory stores computer-executable instructions;
the processor executes computer-executable instructions stored in the memory to implement the method of any one of claims 1 to 7.
10. A computer readable storage medium having stored therein computer-executable instructions which, when executed by a processor, implement the method of any one of claims 1 to 7.
CN202311033962.2A 2023-08-16 2023-08-16 Vehicle perception fusion method, device and storage medium based on high-precision map Pending CN117029857A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311033962.2A CN117029857A (en) 2023-08-16 2023-08-16 Vehicle perception fusion method, device and storage medium based on high-precision map

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311033962.2A CN117029857A (en) 2023-08-16 2023-08-16 Vehicle perception fusion method, device and storage medium based on high-precision map

Publications (1)

Publication Number Publication Date
CN117029857A 2023-11-10

Family

ID=88644548

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311033962.2A Pending CN117029857A (en) 2023-08-16 2023-08-16 Vehicle perception fusion method, device and storage medium based on high-precision map

Country Status (1)

Country Link
CN (1) CN117029857A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117490728A (en) * 2023-12-28 2024-02-02 合众新能源汽车股份有限公司 Lane line positioning fault diagnosis method and system
CN117490728B (en) * 2023-12-28 2024-04-02 合众新能源汽车股份有限公司 Lane line positioning fault diagnosis method and system

Similar Documents

Publication Publication Date Title
CN109669202B (en) Position estimating apparatus and method
JP6469220B2 (en) Traveling lane discrimination device and traveling lane discrimination method
CN111380539A (en) Vehicle positioning and navigation method and device and related system
WO2020199566A1 (en) Method and apparatus for updating matching relationship between navigation map and perception image
CN111102988A (en) Map-based path planning method, server, vehicle-mounted terminal, and storage medium
JP6810257B2 (en) Methods and systems for locating vehicles
CN117029857A (en) Vehicle perception fusion method, device and storage medium based on high-precision map
EP2482036A2 (en) Course guidance system, course guidance method, and course guidance program
CN111272190A (en) Map calibration error detection method and device
JP2018529954A (en) Method for updating an electronic map of a vehicle
JP2020056785A (en) Method and device for operating vehicle
CN111194397B (en) Method for operating a navigation system
JP5208016B2 (en) Merge / Exit Determination Device and Merge / Exit Determination Program
JP2009069900A (en) Vehicular navigation device
WO2010101199A1 (en) Road traffic information creation device and road traffic information creation method
CN116416588A (en) Lane line prediction method, lane line prediction device, electronic equipment and storage medium
CN116295490A (en) Vehicle positioning method and device, electronic equipment and storage medium
US20200385008A1 (en) Safety-Aware Comparator for Redundant Subsystems in Autonomous Vehicles
JP2017156112A (en) Vehicle position detection device
CN115112125A (en) Positioning method and device for automatic driving vehicle, electronic equipment and storage medium
CN115683124A (en) Method for determining a driving trajectory
CN115328893A (en) Data processing method, device, equipment and computer storage medium
CN113183988A (en) Method, device and equipment for supervising automatic driving of vehicle and storage medium
CN116027375B (en) Positioning method and device for automatic driving vehicle, electronic equipment and storage medium
JP2005091071A (en) Road information learning system and road information learning method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination