CN117490727A - Positioning accuracy evaluation method and device and electronic equipment - Google Patents

Positioning accuracy evaluation method and device and electronic equipment

Info

Publication number
CN117490727A
Authority
CN
China
Prior art keywords
lane line
average value
sensing
positioning accuracy
map
Prior art date
Legal status
Granted
Application number
CN202311811573.8A
Other languages
Chinese (zh)
Other versions
CN117490727B (en)
Inventor
张瑜
倪宏杰
李欣博
Current Assignee
Hozon New Energy Automobile Co Ltd
Original Assignee
Hozon New Energy Automobile Co Ltd
Priority date
Filing date
Publication date
Application filed by Hozon New Energy Automobile Co Ltd
Priority to CN202311811573.8A
Publication of CN117490727A
Application granted
Publication of CN117490727B
Legal status: Active

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00 - Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 - Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/01 - Satellite radio beacon positioning systems transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/13 - Receivers
    • G01S19/23 - Testing, monitoring, correcting or calibrating of receiver elements
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/50 - Context or environment of the image
    • G06V20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588 - Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Manufacturing & Machinery (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Navigation (AREA)

Abstract

The invention provides a positioning accuracy evaluation method and device and electronic equipment. The method comprises: sensing the lane lines on both sides of a target vehicle to obtain a first perceived lane line and a second perceived lane line; selecting a plurality of sampling points on each of the first and second perceived lane lines in the world coordinate system; calculating a first average value of the Y-axis deviations of the sampling points on the first perceived lane line and a second average value of the Y-axis deviations of the sampling points on the second perceived lane line, where the Y-axis deviation of a sampling point is the difference between its Y-axis value and the Y-axis value of the corresponding map point on the map lane line when the perceived lane line is projected onto the map, the map lane line being the lane line on the map that corresponds to the perceived lane line; and averaging the first and second average values to obtain a third average value, which serves as a first evaluation index of positioning accuracy. Because positioning accuracy is evaluated without a ground truth, no expensive ground-truth equipment is required, making the evaluation more convenient and efficient.

Description

Positioning accuracy evaluation method and device and electronic equipment
Technical Field
The invention relates to the technical field of positioning data measurement and processing, and in particular to a positioning accuracy evaluation method and device and electronic equipment.
Background
An unmanned (driverless) vehicle is an intelligent vehicle that achieves autonomous operation mainly through an in-vehicle, computer-based intelligent driving system. It integrates numerous artificial-intelligence technologies such as automatic control, environment interaction and visual recognition, and is a highly developed product of computer science, pattern recognition and intelligent control. The vehicle senses the road environment through an on-board sensing system, automatically plans a driving route and controls itself to reach a preset destination: on-board sensors perceive the surrounding environment, and the steering and speed of the vehicle are controlled according to the perceived road, vehicle position and obstacle information, so that the vehicle can travel on the road safely and reliably.
At present, mass-production intelligent driving technology focuses mainly on the L2+ level. In high-level navigated assisted driving, the intelligent driving stack is divided into perception, positioning, and planning-and-control algorithms. Positioning plays an important role in intelligent driving: high-accuracy positioning enables functions in multiple scenarios, such as fully automated driving on expressways and automatic parking, and it can also cooperate with technologies such as the electronic horizon of an ADAS (Advanced Driver Assistance System) to help the vehicle control its body in situations that vision alone cannot resolve, bringing fuel savings and a better assisted-driving experience.
When a positioning algorithm is simulated, a good set of evaluation indices is needed to judge the accuracy of the simulation result. In existing evaluation approaches, the positioning result obtained by a higher-accuracy sensor is usually taken as the ground truth, and the root mean square error or other indices are then computed against that truth. However, such approaches require a ground truth to evaluate positioning accuracy, and in practical engineering scenarios the ground truth is often unavailable and the ground-truth equipment is usually expensive.
Disclosure of Invention
The technical problem to be solved by the invention is to provide a positioning accuracy evaluation method and device and electronic equipment that evaluate positioning accuracy without relying on a ground truth and therefore without expensive ground-truth equipment, making the evaluation more convenient and efficient.
To solve the above technical problem, in a first aspect, the invention provides a positioning accuracy evaluation method, comprising: sensing the lane lines on both sides of a target vehicle to obtain perceived lane lines, namely a first perceived lane line and a second perceived lane line; selecting a plurality of sampling points on each of the first perceived lane line and the second perceived lane line in a world coordinate system; calculating a first average value of the Y-axis deviations of the sampling points on the first perceived lane line, and calculating a second average value of the Y-axis deviations of the sampling points on the second perceived lane line, where the Y-axis deviation of a sampling point is the difference between its Y-axis value and the Y-axis value of the corresponding map point on the map lane line when the perceived lane line is projected onto the map, the map lane line being the lane line on the map corresponding to the perceived lane line; and calculating the average of the first average value and the second average value to obtain a third average value, the third average value being taken as a first evaluation index of positioning accuracy.
Optionally, if p third average values are obtained within the statistical time, the ratio of the sum of the p third average values to the statistical time is calculated to obtain a second evaluation index, and/or the ratio of the sum of the p third average values to the statistical distance is calculated to obtain a third evaluation index; the statistical time runs from the calculation of the first third average value to the calculation of the last third average value, and the statistical distance is the distance travelled by the target vehicle during the statistical time.
Optionally, in the step of sensing the lane lines on both sides of the target vehicle, the lane lines closest to the target vehicle on its two sides are taken as the first perceived lane line and the second perceived lane line, respectively.
Optionally, the method further comprises: converting the positioning coordinates of the target vehicle and the coordinates of the perceived lane lines into the world coordinate system.
Optionally, the method further comprises: determining a first end and a second end along the direction of the perceived lane line, the distance from the second end to the target vehicle being greater than the distance from the first end to the target vehicle, with all sampling points lying between the first end and the second end.
Optionally, both the first end and the second end are forward or rearward of the target vehicle.
Optionally, the method further comprises: at the current time, calculating a first difference S1 = Sd - So for each marker, where Sd is the area of the bounding box of the perceived marker obtained by the sensing algorithm, and So is the area of overlap with the bounding box of the corresponding marker on the map when the perceived marker is projected onto the map; and calculating the average of the first differences S1 of all markers at the current time to obtain a second difference S2, the second difference S2 being taken as a fourth evaluation index of positioning accuracy.
Optionally, the method further comprises: selecting objects whose overlap area So is greater than an area threshold as the markers.
Optionally, the method further comprises: if q second differences S2 are obtained within the statistical time, calculating the ratio of the sum of the q second differences S2 to the statistical time to obtain a fifth evaluation index, and/or calculating the ratio of the sum of the q second differences S2 to the statistical distance to obtain a sixth evaluation index; the statistical time runs from the calculation of the first second difference S2 to the calculation of the last second difference S2, and the statistical distance is the distance travelled by the target vehicle during the statistical time.
Optionally, the method further comprises: setting a seventh evaluation index which is a weighted average of the first evaluation index and the fourth evaluation index under different weights.
Optionally, the weight of the first evaluation index is greater than the weight of the fourth evaluation index.
In a second aspect, the invention provides a positioning accuracy evaluation device, comprising: a sensing module, configured to sense the lane lines on both sides of a target vehicle to obtain perceived lane lines, namely a first perceived lane line and a second perceived lane line; a sampling module, configured to select a plurality of sampling points on each of the first perceived lane line and the second perceived lane line in a world coordinate system; a first calculation module, configured to calculate a first average value of the Y-axis deviations of the sampling points on the first perceived lane line and a second average value of the Y-axis deviations of the sampling points on the second perceived lane line, where the Y-axis deviation of a sampling point is the difference between its Y-axis value and the Y-axis value of the corresponding map point on the map lane line when the perceived lane line is projected onto the map, the map lane line being the lane line on the map corresponding to the perceived lane line; and a second calculation module, configured to calculate the average of the first average value and the second average value to obtain a third average value, the third average value being taken as a first evaluation index of positioning accuracy.
Optionally, the device further comprises: a third calculation module, configured to calculate, at the current time, a first difference S1 = Sd - So for each marker, where Sd is the area of the bounding box of the perceived marker obtained by the sensing algorithm, and So is the area of overlap with the bounding box of the corresponding marker on the map when the perceived marker is projected onto the map; and a fourth calculation module, configured to calculate the average of the first differences S1 of all markers at the current time to obtain a second difference S2, the second difference S2 being taken as a fourth evaluation index of positioning accuracy.
In a third aspect, the present invention provides an electronic device, comprising: a processor and a memory storing a program or instructions executable on the processor, which when executed by the processor, implement the steps of the positioning accuracy evaluation method according to the first aspect.
In a fourth aspect, the present invention provides a readable storage medium having stored thereon a program or instructions which, when executed by a processor, implement the steps of the positioning accuracy evaluation method according to the first aspect.
Compared with the prior art, the invention has the following advantages: the lane lines on both sides of the target vehicle are first sensed to obtain a first perceived lane line and a second perceived lane line; a plurality of sampling points are then selected on each of the two perceived lane lines in the world coordinate system; a first average value of the Y-axis deviations of the sampling points on the first perceived lane line and a second average value of the Y-axis deviations of the sampling points on the second perceived lane line are calculated; and finally the average of the first and second average values is calculated to obtain a third average value, which is taken as a first evaluation index of positioning accuracy. Because positioning accuracy is evaluated without a ground truth, no expensive ground-truth equipment is required, which makes the evaluation more convenient and efficient.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiments of the application and together with the description serve to explain the principles of the application. In the accompanying drawings:
FIG. 1 is a flow chart of a positioning accuracy evaluation method according to an embodiment of the present invention;
FIG. 2 is a diagram showing the calculation of Y-axis deviation in a positioning accuracy evaluation method according to an embodiment of the present invention;
FIG. 3 is a flow chart of a positioning accuracy evaluation method according to another embodiment of the present invention;
FIG. 4 is a view showing the calculated area deviation in a positioning accuracy evaluation method according to another embodiment of the present invention;
FIG. 5 is a schematic view showing the structure of a positioning accuracy evaluation apparatus according to an embodiment of the present invention;
FIG. 6 is a schematic view of a positioning accuracy evaluation apparatus according to another embodiment of the present invention;
FIG. 7 is a schematic diagram of an electronic device according to an embodiment of the invention.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are used in the description of the embodiments will be briefly described below. It is apparent that the drawings in the following description are only some examples or embodiments of the present application, and it is obvious to those skilled in the art that the present application may be applied to other similar situations according to the drawings without inventive effort. Unless otherwise apparent from the context of the language or otherwise specified, like reference numerals in the figures refer to like structures or operations.
As used in this application and in the claims, the terms "a," "an," and "the" do not refer specifically to the singular and may include the plural unless the context clearly dictates otherwise. In general, the terms "comprises" and "comprising" merely indicate that the explicitly identified steps and elements are included; they do not constitute an exclusive list, and a method or apparatus may also include other steps or elements.
The relative arrangement of the components and steps, numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present application unless it is specifically stated otherwise. Meanwhile, it should be understood that the sizes of the respective parts shown in the drawings are not drawn in actual scale for convenience of description. Techniques, methods, and apparatus known to one of ordinary skill in the relevant art may not be discussed in detail, but should be considered part of the specification where appropriate. In all examples shown and discussed herein, any specific values should be construed as merely illustrative, and not a limitation. Thus, other examples of the exemplary embodiments may have different values. It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further discussion thereof is necessary in subsequent figures.
In addition, the terms "first", "second", etc. are used to label components merely for convenience in distinguishing them; unless otherwise stated, these terms have no special meaning and should not be construed as limiting the scope of the present application. Furthermore, although the terms used in the present application are selected from publicly known and commonly used terms, some terms mentioned in this specification may have been chosen by the applicant at his or her discretion, and their detailed meanings are described in the relevant parts of the description. The present application should therefore be understood not simply by the actual terms used but by the meaning that each term carries.
Flowcharts are used in this application to describe the operations performed by systems according to embodiments of the present application. It should be understood that the preceding or following operations are not necessarily performed in order precisely. Rather, the various steps may be processed in reverse order or simultaneously. At the same time, other operations are added to or removed from these processes.
Embodiment one: FIG. 1 is a flowchart of a positioning accuracy evaluation method according to an embodiment of the present invention. Referring to FIG. 1, a method 100 comprises: S110, sensing the lane lines on both sides of a target vehicle to obtain perceived lane lines, namely a first perceived lane line and a second perceived lane line; S120, selecting a plurality of sampling points on each of the first perceived lane line and the second perceived lane line in a world coordinate system; S130, calculating a first average value of the Y-axis deviations of the sampling points on the first perceived lane line and a second average value of the Y-axis deviations of the sampling points on the second perceived lane line, where the Y-axis deviation of a sampling point is the difference between its Y-axis value and the Y-axis value of the corresponding map point on the map lane line when the perceived lane line is projected onto the map, the map lane line being the lane line on the map corresponding to the perceived lane line; and S140, calculating the average of the first average value and the second average value to obtain a third average value, the third average value being taken as a first evaluation index of positioning accuracy.
In this embodiment, the first evaluation index is also referred to as the lateral deviation index (Lateral Deviation, LD); it is obtained by calculating the lateral deviation (i.e., the difference in Y-axis values) between the perceived lane lines and the map lane lines.
In an example, in the step of sensing the lane lines on both sides of the target vehicle, the lane lines closest to the vehicle on its two sides are taken as the first perceived lane line and the second perceived lane line, respectively. When the vehicle is driving or stopped in a lane, the lane lines it perceives vary with the road conditions: if the road is wide and has many lanes, the vehicle perceives more lanes and therefore more lane lines. In theory, any lane line on the road perceived by the vehicle could serve as one of the lane lines on the two sides of the vehicle in this embodiment; however, the farther a lane line is from the vehicle, the poorer its perception quality, which would adversely affect an objective evaluation of positioning accuracy. Therefore, this embodiment selects the lane lines closest to the vehicle on its two sides as the first perceived lane line and the second perceived lane line.
In one example, the method of this embodiment further comprises converting the positioning coordinates of the target vehicle and the coordinates of the perceived lane lines into a world coordinate system. For example, global satellite positioning (GPS) measurements use a geocentric (earth-centered) coordinate system, whereas classical geodetic surveying uses a geodetic coordinate system whose origin is the center of a reference ellipsoid. Because the two systems use different ellipsoid parameters and different origins, a certain error arises between observed and known values. Converting the positioning coordinates and the perceived lane line coordinates into the world coordinate system therefore facilitates alignment with the underlying map and improves positioning precision and accuracy.
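The following is only an illustrative sketch (not part of the patent text) of the body-frame-to-world-frame conversion described above, assuming a planar vehicle pose (x, y, yaw) reported by the positioning algorithm; the language (Python) and all function and variable names are assumptions made for illustration.

    import math

    def body_to_world(x_b, y_b, veh_x, veh_y, veh_yaw):
        """Convert a point (x_b, y_b) from the vehicle body frame to the world frame,
        given the planar vehicle pose (veh_x, veh_y, veh_yaw) output by the positioning
        algorithm. Altitude and geodetic-datum conversions are omitted for brevity."""
        cos_y, sin_y = math.cos(veh_yaw), math.sin(veh_yaw)
        x_w = veh_x + cos_y * x_b - sin_y * y_b  # rotate by yaw, then translate
        y_w = veh_y + sin_y * x_b + cos_y * y_b
        return x_w, y_w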
In an example, a first end and a second end may also be determined along the direction of the perceived lane line, with the second end farther from the target vehicle than the first end and all sampling points lying between the first end and the second end. More preferably, both the first end and the second end are in front of or behind the target vehicle.
In this embodiment, determining the first end and the second end and selecting the sampling points between them keeps the sampling points within a concentrated range, which facilitates an objective evaluation of positioning accuracy. In addition, choosing the first end and the second end on the same side of the target vehicle reduces the influence of sampling on the evaluation method of this embodiment, making the positioning accuracy evaluation more accurate and objective.
For example, FIG. 2 illustrates the calculation of the Y-axis deviation in the positioning accuracy evaluation method according to an embodiment of the present invention. Referring to FIG. 2, the evaluation proceeds as follows:
1) First, the lane lines on both sides of the vehicle are sensed, yielding a first perceived lane line P1 and a second perceived lane line P2.
2) The positioning coordinates and the two perceived lane lines are converted from the vehicle body coordinate system to the world coordinate system. If the location coordinates and perceived lane line coordinates are in the world coordinate system, this step may be omitted.
3) The first end (the near end, x = d1) and the second end (the far end, x = d2) are set; reasonable reference values are d1 = 2 m and d2 = 10 m. Assuming the rear-wheel position of the vehicle is the origin o, the vehicle front is roughly at d1, and the distance between d1 and d2 is about two parking-space lengths. Because the perception quality of the distant section of a perceived lane line is generally poor, setting the far end too far away introduces lane accuracy deviation and makes the calculation inaccurate.
4) Referring to FIG. 2, at the current time t1, a plurality of sampling points are selected on the first perceived lane line P1 and the second perceived lane line P2, respectively. As shown in the figure, the coordinates of the sampling points on the first perceived lane line P1 are (x1, y11'), (x2, y12'), ..., (xn, y1n'), and the coordinates of the sampling points on the second perceived lane line P2 are (x1, y21'), (x2, y22'), ..., (xn, y2n'). Of course, when the sampling points are selected, the X-axis coordinates of the points on P1 may differ from those of the points on P2 without affecting the essence of this embodiment; using the same X-axis coordinates simply makes the calculation simpler.
It can also be seen from FIG. 2 that, after the perceived lane lines are projected onto the map, they cannot completely coincide with the map lane lines that actually correspond to them because of the positioning deviation; the perceived lane lines therefore show a certain deviation from the map lane lines. The map lane line corresponding to the first perceived lane line P1 is denoted the first map lane line Z1, and the map lane line corresponding to the second perceived lane line P2 is denoted the second map lane line Z2.
5) Let the number of sampling points on each perceived lane line be n (for example, n = 10, 20 or 30). The average of the Y-axis deviations between the n sampling points on the first perceived lane line P1 and the corresponding map points on the first map lane line Z1 is calculated, i.e. the first average value dis1 = (dis_y11 + dis_y21 + ... + dis_yn1)/n, where dis_y11, dis_y21, ..., dis_yn1 are the Y-axis deviations of the n sampling points on the first perceived lane line. The average of the Y-axis deviations between the n sampling points on the second perceived lane line P2 and the corresponding map points on the second map lane line Z2 is calculated likewise, i.e. the second average value dis2 = (dis_y12 + dis_y22 + ... + dis_yn2)/n, where dis_y12, dis_y22, ..., dis_yn2 are the Y-axis deviations of the n sampling points on the second perceived lane line. The total average deviation at time t1, i.e. the third average value, is then dis_t = (dis1 + dis2)/2, and dis_t is taken as the first evaluation index.
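A minimal sketch of step 5) is given below, assuming the sampling points and their corresponding map points are already paired and expressed in the world coordinate system; the absolute value of the Y-axis difference is used here, and all names are illustrative rather than taken from the patent.

    def lane_lateral_deviation(samples_p1, map_pts_z1, samples_p2, map_pts_z2):
        """First evaluation index dis_t at one timestamp; each argument is a list of
        (x, y) tuples, the i-th map point being the projection of the i-th sample."""
        def mean_y_dev(samples, map_pts):
            devs = [abs(y_s - y_m) for (_, y_s), (_, y_m) in zip(samples, map_pts)]
            return sum(devs) / len(devs)
        dis1 = mean_y_dev(samples_p1, map_pts_z1)  # first average value (P1 vs Z1)
        dis2 = mean_y_dev(samples_p2, map_pts_z2)  # second average value (P2 vs Z2)
        return (dis1 + dis2) / 2.0                 # third average value dis_t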
In an example, the method of this embodiment further comprises: if p third average values are obtained within the statistical time, calculating the ratio of the sum of the p third average values to the statistical time to obtain the second evaluation index, and/or calculating the ratio of the sum of the p third average values to the statistical distance to obtain the third evaluation index. The statistical time runs from the calculation of the first third average value to the calculation of the last third average value, and the statistical distance is the distance travelled by the target vehicle during the statistical time.
For example, let T be the time from the start of calculating the first third average value to the end of calculating the last third average value, let D be the distance travelled by the vehicle (i.e. the statistical distance), and suppose p third average values are obtained in total. The first evaluation index in the time dimension, i.e. the second evaluation index, is denoted LD_t and calculated as LD_t = (dis_t1 + dis_t2 + ... + dis_tp)/T, in m/s, where dis_t1, dis_t2, ..., dis_tp are the p third average values. The first evaluation index in the distance dimension, i.e. the third evaluation index, is denoted LD_d and calculated as LD_d = (dis_t1 + dis_t2 + ... + dis_tp)/D, in m/m.
Typically, when the first evaluation index (lateral deviation index) uses 10 sampling points and statistics are taken once every 100 ms, a reference value of 1 m/s may be used for LD_t and a reference value of 1/50 m/m for LD_d.
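The time- and distance-normalized indices described above can be sketched as follows; dis_t_values would hold the p third average values collected over the statistical window, and the function and parameter names are hypothetical.

    def ld_indices(dis_t_values, stat_time_s, stat_distance_m):
        """Second and third evaluation indices: LD_t = sum(dis_t)/T in m/s,
        LD_d = sum(dis_t)/D in m/m."""
        total = sum(dis_t_values)
        return total / stat_time_s, total / stat_distance_m

For instance, with statistics taken every 100 ms over a window of duration T seconds and driven distance D metres, the two returned values can be compared against the reference values of 1 m/s and 1/50 m/m mentioned above.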
According to the positioning accuracy evaluation method provided by this embodiment, the lane lines on both sides of the target vehicle are first sensed to obtain a first perceived lane line and a second perceived lane line; a plurality of sampling points are then selected on each of the two perceived lane lines in the world coordinate system; a first average value of the Y-axis deviations of the sampling points on the first perceived lane line and a second average value of the Y-axis deviations of the sampling points on the second perceived lane line are calculated; and finally the average of the first and second average values is calculated to obtain a third average value, which is taken as the first evaluation index of positioning accuracy.
Embodiment two: FIG. 3 is a flowchart of a positioning accuracy evaluation method according to another embodiment of the present invention. Referring to FIG. 3, a method 300 comprises: S110, sensing the lane lines on both sides of a target vehicle to obtain perceived lane lines, namely a first perceived lane line and a second perceived lane line; S120, selecting a plurality of sampling points on each of the first perceived lane line and the second perceived lane line in a world coordinate system; S130, calculating a first average value of the Y-axis deviations of the sampling points on the first perceived lane line and a second average value of the Y-axis deviations of the sampling points on the second perceived lane line, where the Y-axis deviation of a sampling point is the difference between its Y-axis value and the Y-axis value of the corresponding map point on the map lane line when the perceived lane line is projected onto the map, the map lane line being the lane line on the map corresponding to the perceived lane line; S140, calculating the average of the first average value and the second average value to obtain a third average value, the third average value being taken as a first evaluation index of positioning accuracy; S310, at the current time, calculating a first difference S1 = Sd - So for each marker, where Sd is the area of the bounding box of the perceived marker obtained by the sensing algorithm and So is the area of overlap with the bounding box of the corresponding marker on the map when the perceived marker is projected onto the map; and S320, calculating the average of the first differences S1 of all markers at the current time to obtain a second difference S2, the second difference S2 being taken as a fourth evaluation index of positioning accuracy.
In this embodiment, the fourth evaluation index is also referred to as the longitudinal deviation index (Vertical Deviation, VD); it is obtained by calculating the area deviation between the perceived markers and the corresponding markers in the high-definition map.
In one example, objects whose overlap area So is greater than an area threshold are selected as markers. When selecting markers, not every object that can be perceived is suitable as a marker, for the sake of the accuracy of the evaluation method of this embodiment. On the one hand, some objects are too small, their positioning error is relatively large or their perception quality is poor, and they can hardly reflect the positioning accuracy truthfully; on the other hand, if the overlap between a perceived object and the map marker is small, the positioning accuracy cannot be reflected accurately either. In addition, since the method of this embodiment compares the areas of a perceived marker and a map marker, the map must contain the corresponding marker; otherwise, even if the sensing algorithm perceives a marker, there is no map marker to compare it against. For these reasons, this embodiment selects objects whose overlap area So is greater than an area threshold as markers, where the area threshold can be determined according to the actual situation by jointly considering the positioning accuracy and the size of the perceived markers, and is not specifically limited here.
For example, a high-definition map may be chosen as the map to be compared against in this embodiment, and the markers may be traffic signs. A traffic sign is used when its Shape is of the Bounding Box type; the information provided is the four boundary points of the bounding box, from which the position of the map traffic sign on the map can be obtained.
The markers can be perceived by mature perception algorithms. The perception algorithm feeds radar point-cloud information and camera image information into a deep learning model and outputs the position of the corresponding traffic sign in the vehicle coordinate system, including the positions of its four boundary points and the type of the traffic sign. The perceived traffic sign is then projected into the map (coordinate system) using the pose output by the current positioning algorithm, so that the overlap area between the perceived traffic sign and the map traffic sign can be calculated for the subsequent computation and evaluation.
For example, FIG. 4 illustrates the calculation of the area deviation in a positioning accuracy evaluation method according to another embodiment of the present invention. Referring to FIG. 4, the evaluation proceeds as follows:
1) Set the near end x = d1 and the far end x = d2, and filter out the markers of the main lane (such as traffic lights, signs and lamp poles) between d1 and d2 in the vehicle coordinate system. Denote the map markers by Mi and the perceived markers by Nj, with i >= 0 and j >= 0. Because the vehicle is constantly moving, many markers are perceived along the road, and the same physical marker does not necessarily appear at the same index in the sequence of map markers on the map and in the sequence of perceived markers from the perception algorithm; i therefore denotes the i-th map marker and j the j-th perceived marker.
2) The perceived markers Nj filtered from the positioning result are converted from the vehicle body coordinate system to the world coordinate system (this step may be omitted if the perceived markers are already in the world coordinate system).
3) Suppose that at the current time t1 the bounding box of a map marker and the perceived bounding box of a perceived marker partially or completely overlap on the map, and that the pair Mi and Nj with the largest overlap area are matched; the number of successfully matched pairs is denoted k. Denote the area of Mi by Sm, the area of Nj by Sd, and the overlap area of Mi and Nj by So; the area deviation of Mi and Nj (i.e. the first difference) is then S1 = Sd - So. In the figure, the dashed box represents the perceived marker area Sd, the overlap between the perceived marker and the map marker is the hatched area So, and the area deviation S1 is the blank area inside the bounding box of the perceived marker.
As described above, if k markers are matched, the average area deviation at time t1 (the second difference) is finally obtained as S2 = (S11 + S12 + ... + S1(k-1) + S1k)/k, where S11, S12, ..., S1(k-1), S1k are the k first differences, and the second difference S2 is taken as the fourth evaluation index.
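A simplified sketch of steps 1) to 3) follows. It is restricted to axis-aligned bounding boxes for brevity (the patent describes general four-point bounding boxes) and matches each perceived marker to the map marker with the largest overlap; this greedy matching and all names are illustrative assumptions rather than the patented procedure itself.

    def overlap_area(a, b):
        """Overlap area S_o of two axis-aligned boxes (x_min, y_min, x_max, y_max)."""
        w = min(a[2], b[2]) - max(a[0], b[0])
        h = min(a[3], b[3]) - max(a[1], b[1])
        return max(w, 0.0) * max(h, 0.0)

    def mean_area_deviation(perceived_boxes, map_boxes, area_threshold=0.0):
        """Second difference S_2 at one timestamp: mean of S_1 = S_d - S_o over matched markers."""
        first_diffs = []
        for nj in perceived_boxes:                   # perceived marker Nj, already in the world frame
            s_d = (nj[2] - nj[0]) * (nj[3] - nj[1])  # bounding-box area S_d
            # match Nj to the map marker Mi with the largest overlap area S_o
            s_o = max((overlap_area(nj, mi) for mi in map_boxes), default=0.0)
            if s_o > area_threshold:                 # keep only markers with a sufficiently large overlap
                first_diffs.append(s_d - s_o)        # first difference S_1
        return sum(first_diffs) / len(first_diffs) if first_diffs else None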
In an example, the method of this embodiment further comprises: if q second differences S2 are obtained within the statistical time, calculating the ratio of the sum of the q second differences S2 to the statistical time to obtain a fifth evaluation index, and/or calculating the ratio of the sum of the q second differences S2 to the statistical distance to obtain a sixth evaluation index; the statistical time runs from the calculation of the first second difference S2 to the calculation of the last second difference S2, and the statistical distance is the distance travelled by the target vehicle during the statistical time.
For example, let T be the time from the start of calculating the first second difference S2 to the end of calculating the last second difference S2, let D be the distance travelled by the vehicle (i.e. the statistical distance), and suppose q second differences are obtained in total. The fourth evaluation index in the time dimension, i.e. the fifth evaluation index, is denoted VD_t and calculated as VD_t = (S21 + S22 + ... + S2q)/T, in m²/s, where S21, S22, ..., S2q are the q second differences. The fourth evaluation index in the distance dimension, i.e. the sixth evaluation index, is denoted VD_d and calculated as VD_d = (S21 + S22 + ... + S2q)/D, in m²/m.
Typically, when the fourth evaluation index (longitudinal deviation index) uses 10 sampling points and statistics are taken once every 100 ms, a reference value of 1 m²/s may be used for VD_t and a reference value of 1/50 m²/m for VD_d.
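Mirroring the LD normalization sketched earlier, the q second differences can be normalized over the statistical window as below; the names are hypothetical.

    def vd_indices(s2_values, stat_time_s, stat_distance_m):
        """Fifth and sixth evaluation indices: VD_t = sum(S_2)/T in m^2/s,
        VD_d = sum(S_2)/D in m^2/m."""
        total = sum(s2_values)
        return total / stat_time_s, total / stat_distance_m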
In an example, the method of this embodiment may further comprise setting a seventh evaluation index, which is a weighted average of the first evaluation index and the fourth evaluation index under different weights. With respect to the positioning accuracy of the positioning result, the first evaluation index and the fourth evaluation index are assigned different weights for different application scenarios, yielding a positioning accuracy deviation index (Accuracy Deviation, AD).
In one example, the weight of the first evaluation index is greater than the weight of the fourth evaluation index. In the positioning accuracy evaluation of this embodiment, the lateral deviation is generally of greater concern, so the first evaluation index is weighted more heavily than the fourth. For example, the first evaluation index may be given a weight of 0.7 and the fourth evaluation index a weight of 0.3, with the two weights summing to 1.
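Using the example weights just given (0.7 and 0.3), the seventh evaluation index can be sketched as a simple weighted combination; the function name is illustrative.

    def accuracy_deviation(ld, vd, w_ld=0.7, w_vd=0.3):
        """Seventh evaluation index (AD): weighted combination of the lateral (LD) and
        longitudinal (VD) indices, with the weights summing to 1 and LD weighted more heavily."""
        return w_ld * ld + w_vd * vd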
Details of other operations performed by the steps in this embodiment may refer to the same details in the foregoing embodiments, and will not be further described herein.
The positioning accuracy evaluation method provided by this embodiment does not need a ground truth to evaluate positioning accuracy and therefore requires no expensive ground-truth equipment, making it more convenient and efficient. Moreover, it can evaluate not only the lateral deviation but also the longitudinal deviation, and by evaluating both together it assesses positioning accuracy more comprehensively.
Embodiment three: FIG. 5 is a schematic structural diagram of a positioning accuracy evaluation device according to an embodiment of the present invention. Referring to FIG. 5, a device 500 comprises: a sensing module 501, configured to sense the lane lines on both sides of a target vehicle to obtain perceived lane lines, namely a first perceived lane line and a second perceived lane line; a sampling module 502, configured to select a plurality of sampling points on each of the first perceived lane line and the second perceived lane line in a world coordinate system; a first calculation module 503, configured to calculate a first average value of the Y-axis deviations of the sampling points on the first perceived lane line and a second average value of the Y-axis deviations of the sampling points on the second perceived lane line, where the Y-axis deviation of a sampling point is the difference between its Y-axis value and the Y-axis value of the corresponding map point on the map lane line when the perceived lane line is projected onto the map, the map lane line being the lane line on the map corresponding to the perceived lane line; and a second calculation module 504, configured to calculate the average of the first average value and the second average value to obtain a third average value, the third average value being taken as a first evaluation index of positioning accuracy.
In an example, the device 500 may further comprise a first index optimization module, configured to: if p third average values are obtained within the statistical time, calculate the ratio of the sum of the p third average values to the statistical time to obtain a second evaluation index, and/or calculate the ratio of the sum of the p third average values to the statistical distance to obtain a third evaluation index; the statistical time runs from the calculation of the first third average value to the calculation of the last third average value, and the statistical distance is the distance travelled by the target vehicle during the statistical time.
In an example, when the lane lines on both sides of the target vehicle are sensed, the lane lines closest to the target vehicle on its two sides are taken as the first perceived lane line and the second perceived lane line, respectively.
In an example, the device 500 may further comprise a conversion module, configured to convert the positioning coordinates of the target vehicle and the coordinates of the perceived lane lines into the world coordinate system.
In an example, the device 500 further comprises a determining module, configured to determine a first end and a second end along the direction of the perceived lane line, the distance from the second end to the target vehicle being greater than the distance from the first end to the target vehicle, with all sampling points lying between the first end and the second end.
In one example, both the first end and the second end are forward or rearward of the target vehicle.
Reference may be made to the foregoing embodiments for details of other operations performed by the modules in this embodiment, which are not further described herein.
According to the positioning accuracy evaluation device provided by this embodiment, the lane lines on both sides of the target vehicle are first sensed to obtain a first perceived lane line and a second perceived lane line; a plurality of sampling points are then selected on each of the two perceived lane lines in the world coordinate system; a first average value of the Y-axis deviations of the sampling points on the first perceived lane line and a second average value of the Y-axis deviations of the sampling points on the second perceived lane line are calculated; and finally the average of the first and second average values is calculated to obtain a third average value, which is taken as the first evaluation index of positioning accuracy.
Embodiment four: FIG. 6 is a schematic structural diagram of a positioning accuracy evaluation device according to another embodiment of the present invention. Referring to FIG. 6, a device 600 comprises: a sensing module 501, configured to sense the lane lines on both sides of a target vehicle to obtain perceived lane lines, namely a first perceived lane line and a second perceived lane line; a sampling module 502, configured to select a plurality of sampling points on each of the first perceived lane line and the second perceived lane line in a world coordinate system; a first calculation module 503, configured to calculate a first average value of the Y-axis deviations of the sampling points on the first perceived lane line and a second average value of the Y-axis deviations of the sampling points on the second perceived lane line, where the Y-axis deviation of a sampling point is the difference between its Y-axis value and the Y-axis value of the corresponding map point on the map lane line when the perceived lane line is projected onto the map, the map lane line being the lane line on the map corresponding to the perceived lane line; a second calculation module 504, configured to calculate the average of the first average value and the second average value to obtain a third average value, the third average value being taken as a first evaluation index of positioning accuracy; a third calculation module 601, configured to calculate, at the current time, a first difference S1 = Sd - So for each marker, where Sd is the area of the bounding box of the perceived marker obtained by the sensing algorithm and So is the area of overlap with the bounding box of the corresponding marker on the map when the perceived marker is projected onto the map; and a fourth calculation module 602, configured to calculate the average of the first differences S1 of all markers at the current time to obtain a second difference S2, the second difference S2 being taken as a fourth evaluation index of positioning accuracy.
In an example, the device 600 may further comprise a selection module, configured to select objects whose overlap area So is greater than an area threshold as the markers.
In an example, the device 600 may further comprise a second index optimization module, configured to: if q second differences S2 are obtained within the statistical time, calculate the ratio of the sum of the q second differences S2 to the statistical time to obtain a fifth evaluation index, and/or calculate the ratio of the sum of the q second differences S2 to the statistical distance to obtain a sixth evaluation index; the statistical time runs from the calculation of the first second difference S2 to the calculation of the last second difference S2, and the statistical distance is the distance travelled by the target vehicle during the statistical time.
In an example, the apparatus 600 may further include a fifth calculation module configured to set a seventh evaluation index, where the seventh evaluation index is a weighted average of the first evaluation index and the fourth evaluation index under different weights.
In one example, the weight of the first evaluation index is greater than the weight of the fourth evaluation index.
Reference may be made to the foregoing embodiments for details of other operations performed by the modules in this embodiment, which are not further described herein.
The positioning accuracy evaluation device provided by this embodiment does not need a ground truth to evaluate positioning accuracy and therefore requires no expensive ground-truth equipment, making it more convenient and efficient. Moreover, it can evaluate not only the lateral deviation but also the longitudinal deviation, and by evaluating both together it assesses positioning accuracy more comprehensively.
The positioning accuracy evaluation device in the embodiment of the application may be a device, or may be a component, an integrated circuit, or a chip in a terminal. The positioning accuracy evaluation device in the embodiment of the present application may be a device having an operating system. The operating system may be an android operating system, an iOS operating system, or other possible operating systems, which is not specifically limited in the embodiments of the present application.
The application also provides an electronic device, comprising: a memory for storing programs or instructions executable on the processor; and a processor configured to execute the programs or instructions to implement the processes of the above positioning accuracy evaluation method embodiments and achieve the same technical effects. To avoid repetition, the details are not described again here.
FIG. 7 is a schematic diagram of an electronic device according to an embodiment of the invention. The electronic device 700 may include an internal communication bus 701, a processor 702, a read-only memory (ROM) 703, a random access memory (RAM) 704, and a communication port 705. The internal communication bus 701 enables data communication among the components of the electronic device 700. The processor 702 performs the determinations and issues prompts; in some implementations, it may consist of one or more processors. The communication port 705 enables the electronic device 700 to exchange data with the outside; in some implementations, the electronic device 700 may send and receive information and data from a network through the communication port 705. The electronic device 700 may also include program storage units and data storage units in various forms, such as the read-only memory (ROM) 703 and the random access memory (RAM) 704, capable of storing the various data files used for computer processing and/or communication, as well as the programs or instructions executed by the processor 702. Results processed by the processor 702 are communicated to the user device via the communication port 705 for display on a user interface.
An embodiment of the present application further provides a readable storage medium storing a program or instructions which, when executed by a processor, implement the processes of the above positioning accuracy evaluation method embodiments and achieve the same technical effects. To avoid repetition, the details are not described again here.
The processor is a processor in the electronic device in the above embodiment. The readable storage medium includes a computer readable storage medium such as a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a magnetic disk, or an optical disk.
The above disclosure is intended to be illustrative only and is not limiting. Although not explicitly described herein, various modifications, improvements and adaptations of the present application may occur to those skilled in the art. Such modifications, improvements and adaptations are suggested by this application and fall within the spirit and scope of its exemplary embodiments.
Likewise, it should be noted that, in order to simplify the present disclosure and thereby aid the understanding of one or more inventive embodiments, various features are sometimes grouped together in a single embodiment, figure, or description thereof. This method of disclosure is not to be interpreted as implying that the claimed subject matter requires more features than are expressly recited in the claims; rather, inventive subject matter may lie in less than all features of a single disclosed embodiment.
In some embodiments, numbers are used to describe quantities of components and attributes; it should be understood that such numbers used in the description of the embodiments are in some instances qualified by the modifiers "about," "approximately," or "substantially." Unless otherwise indicated, "about," "approximately," or "substantially" indicates that the stated number allows a variation of 20%. Accordingly, in some embodiments, the numerical parameters set forth in the specification and claims are approximations that may vary depending on the desired properties of the individual embodiment. In some embodiments, numerical parameters should be construed in light of the number of reported significant digits and by applying ordinary rounding techniques. Although the numerical ranges and parameters set forth herein are approximations that may be employed in some embodiments to confirm the breadth of the range, in specific embodiments such numerical values are set as precisely as practicable.
While the present application has been described with reference to specific embodiments, those of ordinary skill in the art will recognize that the above embodiments are for illustrative purposes only and that various equivalent changes or substitutions can be made without departing from the spirit of the present application; therefore, changes and modifications to the above embodiments made within the essential spirit of the present application fall within the scope of the claims of the present application.

Claims (15)

1. A positioning accuracy evaluation method, characterized by comprising the following steps:
sensing the lane lines on both sides of a target vehicle to obtain perceived lane lines, namely a first perceived lane line and a second perceived lane line;
selecting a plurality of sampling points on each of the first perceived lane line and the second perceived lane line in a world coordinate system;
calculating a first average value of the Y-axis deviations of the sampling points on the first perceived lane line, and calculating a second average value of the Y-axis deviations of the sampling points on the second perceived lane line, wherein the Y-axis deviation of a sampling point is the difference between its Y-axis value and the Y-axis value of the corresponding map point on the map lane line when the perceived lane line is projected onto the map, and the map lane line is the lane line on the map corresponding to the perceived lane line; and
calculating the average of the first average value and the second average value to obtain a third average value, the third average value being taken as a first evaluation index of the positioning accuracy.
2. The positioning accuracy evaluation method according to claim 1, further comprising:
if p third average values are obtained in the statistical time, calculating the ratio of the sum of the p third average values to the statistical time to obtain a second evaluation index, and/or calculating the ratio of the sum of the p third average values to the statistical distance to obtain a third evaluation index;
wherein the statistical time is from the beginning of calculating the first third average value to the end of calculating the last third average value, and the statistical distance is the distance travelled by the target vehicle during the statistical time.
3. The positioning accuracy evaluation method according to claim 1, wherein, in the step of sensing the lane lines on both sides of the target vehicle, the lane lines closest to the target vehicle on its two sides are the first perceived lane line and the second perceived lane line, respectively.
4. The positioning accuracy evaluation method according to claim 1, further comprising: converting the positioning coordinates of the target vehicle and the coordinates of the perceived lane lines into the world coordinate system.
5. The positioning accuracy evaluation method according to claim 1, further comprising: determining a first end and a second end along the direction of the perceived lane line, wherein the distance between the second end and the target vehicle is greater than the distance between the first end and the target vehicle, and all sampling points are between the first end and the second end.
6. The positioning accuracy evaluation method according to claim 5, wherein the first end and the second end are both in front of or behind the target vehicle.
7. The positioning accuracy evaluation method according to claim 1, further comprising:
calculating, at the current moment, a first difference S1 = Sd - So for each marker, wherein Sd is the area of the bounding box of the marker obtained through a sensing algorithm, and So is the area of overlap with the bounding box of the marker on the map when the sensed bounding box is projected onto the map;
calculating the average value of the first differences S1 of all markers at the current moment to obtain a second difference S2, and taking the second difference S2 as a fourth evaluation index of the positioning accuracy.
8. The positioning accuracy evaluation method according to claim 7, further comprising: selecting an object whose area So is greater than an area threshold as the marker.
9. The positioning accuracy evaluation method according to claim 7, further comprising:
if q second differences S2 are obtained in the statistical time, calculating the ratio of the sum of the q second differences S2 to the statistical time to obtain a fifth evaluation index, and/or calculating the ratio of the sum of the q second differences S2 to the statistical distance to obtain a sixth evaluation index;
wherein the statistical time runs from the start of calculating the first second difference S2 to the end of calculating the last second difference S2, and the statistical distance is the change in the travelled distance of the target vehicle over the statistical time.
10. The positioning accuracy evaluation method according to claim 7, further comprising: setting a seventh evaluation index which is a weighted average of the first evaluation index and the fourth evaluation index under different weights.
11. The positioning accuracy evaluation method according to claim 10, wherein the weight of the first evaluation index is larger than the weight of the fourth evaluation index.
12. A positioning accuracy evaluation device, comprising:
the sensing module is used for sensing lane lines on two sides of the target vehicle to obtain sensing lane lines, namely a first sensing lane line and a second sensing lane line;
the sampling module is used for respectively selecting a plurality of sampling points on the first sensing lane line and the second sensing lane line under a world coordinate system;
the first calculation module is used for calculating a first average value of Y-axis deviations of all sampling points on the first sensing lane line and a second average value of Y-axis deviations of all sampling points on the second sensing lane line; the Y-axis deviation is the difference between the Y-axis value of a sampling point on the sensing lane line and the Y-axis value of the corresponding map point on the map lane line when the sampling point is projected onto the map, and the map lane line is the lane line on the map corresponding to the sensing lane line;
and the second calculation module is used for calculating the average value of the first average value and the second average value to obtain a third average value, the third average value being taken as a first evaluation index of the positioning accuracy.
13. The positioning accuracy evaluation device according to claim 12, further comprising:
a third calculation module for calculating, at the current moment, a first difference S1 = Sd - So for each marker, wherein Sd is the area of the bounding box of the marker obtained through a sensing algorithm, and So is the area of overlap with the bounding box of the marker on the map when the sensed bounding box is projected onto the map;
a fourth calculation module for calculating the average value of the first differences S1 of all markers at the current moment to obtain a second difference S2, and taking the second difference S2 as a fourth evaluation index of the positioning accuracy.
14. An electronic device, comprising: a processor and a memory storing a program or instructions executable on the processor, which, when executed by the processor, implement the steps of the positioning accuracy evaluation method according to any one of claims 1-11.
15. A readable storage medium, wherein a program or instructions are stored on the readable storage medium and, when executed by a processor, implement the steps of the positioning accuracy evaluation method according to any one of claims 1-11.
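The following non-normative Python sketches illustrate the evaluation indices defined in the claims above; every function name, parameter name, and helper (such as the map_y_at lookup) is a hypothetical illustration rather than part of the claimed method. The first sketch corresponds to claim 1, assuming the sensing lane lines have already been matched to their map lane lines and expressed in the world coordinate system, and assuming the Y-axis deviation is taken as an absolute difference.

def mean_y_deviation(samples, map_y_at):
    # samples: (x, y) sampling points taken on one sensing lane line in the world frame
    # map_y_at(x): Y-axis value of the matched map lane line at the same X (hypothetical lookup)
    deviations = [abs(y - map_y_at(x)) for x, y in samples]
    return sum(deviations) / len(deviations)

def first_evaluation_index(left_samples, right_samples, left_map_y_at, right_map_y_at):
    first_average = mean_y_deviation(left_samples, left_map_y_at)      # first average value
    second_average = mean_y_deviation(right_samples, right_map_y_at)   # second average value
    return (first_average + second_average) / 2.0                      # third average value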
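A sketch of the rate-type indices of claim 2 (the same pattern applies to the fifth and sixth indices of claim 9, with the q second differences in place of the p third average values); the time and distance units are assumed to be seconds and metres.

def rate_indices(per_frame_values, statistical_time_s, statistical_distance_m):
    # per_frame_values: the p third average values collected over the statistical period
    total = sum(per_frame_values)
    second_index = total / statistical_time_s       # ratio of the sum to the statistical time
    third_index = total / statistical_distance_m    # ratio of the sum to the statistical distance
    return second_index, third_index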
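For the coordinate conversion of claim 4, a sketch under the assumption that a planar rigid transform from the vehicle frame to the world frame is sufficient, with the vehicle's positioning pose given as (x, y, yaw).

import math

def to_world(points_vehicle, vehicle_x, vehicle_y, vehicle_yaw):
    # Transform sensed lane-line points from the vehicle frame into the world coordinate system.
    c, s = math.cos(vehicle_yaw), math.sin(vehicle_yaw)
    return [(vehicle_x + c * px - s * py, vehicle_y + s * px + c * py)
            for px, py in points_vehicle]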
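For the sampling segment of claims 5 and 6, a sketch that keeps only the points lying between the first end and the farther second end, assuming the X axis points along the lane-line (driving) direction.

def select_samples(lane_line_points, first_end_x, second_end_x):
    # Keep only the lane-line points between the two ends of the sampling segment.
    lo, hi = sorted((first_end_x, second_end_x))
    return [(x, y) for x, y in lane_line_points if lo <= x <= hi]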
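For the marker-based fourth evaluation index of claims 7 and 8, a sketch that filters markers by the overlap area So and averages the first differences S1 = Sd - So; the input format of (Sd, So) pairs is an assumption.

def fourth_evaluation_index(markers, area_threshold=0.0):
    # markers: (Sd, So) pairs for the current moment, where Sd is the sensed bounding-box
    # area and So the overlap area with the marker's bounding box on the map.
    kept = [(sd, so) for sd, so in markers if so > area_threshold]   # claim 8 filter
    if not kept:
        return 0.0                                                   # no valid marker at this moment
    first_differences = [sd - so for sd, so in kept]                 # S1 = Sd - So
    return sum(first_differences) / len(first_differences)          # second difference S2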
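Finally, for the seventh evaluation index of claims 10 and 11, a weighted average in which the first index carries the larger weight; the 0.7/0.3 split is only an illustrative choice, not a value taken from the claims.

def seventh_evaluation_index(first_index, fourth_index, w_first=0.7, w_fourth=0.3):
    # Weighted average of the first and fourth evaluation indices (claim 10), with the
    # first index weighted more heavily than the fourth (claim 11).
    return (w_first * first_index + w_fourth * fourth_index) / (w_first + w_fourth)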
CN202311811573.8A 2023-12-27 2023-12-27 Positioning accuracy evaluation method and device and electronic equipment Active CN117490727B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311811573.8A CN117490727B (en) 2023-12-27 2023-12-27 Positioning accuracy evaluation method and device and electronic equipment

Publications (2)

Publication Number Publication Date
CN117490727A (en) 2024-02-02
CN117490727B CN117490727B (en) 2024-03-29

Family

ID=89685243

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311811573.8A Active CN117490727B (en) 2023-12-27 2023-12-27 Positioning accuracy evaluation method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN117490727B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009259215A (en) * 2008-03-18 2009-11-05 Zenrin Co Ltd Road surface marking map generation method
US20100088164A1 (en) * 2008-10-02 2010-04-08 Certusview Technologies, Llc Methods and apparatus for analyzing locate and marking operations with respect to facilities maps
CN110542436A (en) * 2019-09-11 2019-12-06 百度在线网络技术(北京)有限公司 Evaluation method, device and equipment of vehicle positioning system and storage medium
CN112347205A (en) * 2019-08-06 2021-02-09 北京初速度科技有限公司 Method and device for updating error state of vehicle
US20210164800A1 (en) * 2019-11-29 2021-06-03 Aptiv Technologies Limited Method for Determining the Position of a Vehicle
CN114022860A (en) * 2020-07-16 2022-02-08 长沙智能驾驶研究院有限公司 Target detection method and device and electronic equipment
CN114252082A (en) * 2022-03-01 2022-03-29 苏州挚途科技有限公司 Vehicle positioning method and device and electronic equipment
CN114646317A (en) * 2022-03-17 2022-06-21 长沙慧联智能科技有限公司 Vehicle visual positioning navigation control method and device, computer equipment and medium
CN115143996A (en) * 2022-09-05 2022-10-04 北京智行者科技股份有限公司 Positioning information correction method, electronic device, and storage medium

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
SHUAIZHI FU; WEI CHEN; HAIYANG KONG; et al.: "Lane-Level Map-Matching at Multilane Intersections Based on Network RTK and Internet of Vehicle", 2019 5TH INTERNATIONAL CONFERENCE ON TRANSPORTATION INFORMATION AND SAFETY, 28 October 2019 (2019-10-28) *
BI Yanbing: "A lane departure evaluation method based on multi-edge detection", Journal of Shenzhen Institute of Information Technology, vol. 5, no. 01, 31 March 2007 (2007-03-31) *
CHENG Jun; ZHANG Liyan; CHEN Qihong: "A map-aided visual-inertial fusion positioning method for autonomous driving", Journal of Transportation Systems Engineering and Information Technology, vol. 22, no. 2, 30 April 2022 (2022-04-30) *

Also Published As

Publication number Publication date
CN117490727B (en) 2024-03-29

Similar Documents

Publication Publication Date Title
EP3699048A1 (en) Travelling track prediction method and device for vehicle
Vivacqua et al. Self-localization based on visual lane marking maps: An accurate low-cost approach for autonomous driving
CN102208036B (en) Vehicle position detection system
CN110530372B (en) Positioning method, path determining device, robot and storage medium
CN110426051A (en) A kind of lane line method for drafting, device and storage medium
JP4370869B2 (en) Map data updating method and map data updating apparatus
CN109410301A (en) High-precision semanteme map production method towards pilotless automobile
CN109931939A (en) Localization method, device, equipment and the computer readable storage medium of vehicle
WO2018227980A1 (en) Camera sensor based lane line map construction method and construction system
JP6161942B2 (en) Curve shape modeling device, vehicle information processing system, curve shape modeling method, and curve shape modeling program
CN110345951B (en) ADAS high-precision map generation method and device
CN111220143B (en) Method and device for determining position and posture of imaging equipment
US11731649B2 (en) High precision position estimation method through road shape classification-based map matching and autonomous vehicle thereof
CN109515439A (en) Automatic Pilot control method, device, system and storage medium
CN111427373A (en) Pose determination method, device, medium and equipment
JP2009099125A (en) Image recognition device, image recognition program, and point information collection device and navigation device using them
CN113566817B (en) Vehicle positioning method and device
JP2023164553A (en) Position estimation device, estimation device, control method, program and storage medium
CN113706633B (en) Three-dimensional information determination method and device for target object
JP2018105636A (en) Route generation device
CN114509065A (en) Map construction method, map construction system, vehicle terminal, server side and storage medium
CN117490727B (en) Positioning accuracy evaluation method and device and electronic equipment
CN112651991A (en) Visual positioning method, device and computer system
WO2019188704A1 (en) Self-position estimation device, self-position estimation method, program, and recording medium
CN112837365A (en) Image-based vehicle positioning method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant