CN115143996B - Positioning information correction method, electronic device, and storage medium - Google Patents

Positioning information correction method, electronic device, and storage medium

Info

Publication number
CN115143996B
CN115143996B (application CN202211080002.7A)
Authority
CN
China
Prior art keywords
lane
coordinate system
lane line
positioning information
precision
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211080002.7A
Other languages
Chinese (zh)
Other versions
CN115143996A (en)
Inventor
徐宁
曹菊宁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Idriverplus Technologies Co Ltd
Original Assignee
Beijing Idriverplus Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Idriverplus Technologies Co Ltd filed Critical Beijing Idriverplus Technologies Co Ltd
Priority to CN202211080002.7A priority Critical patent/CN115143996B/en
Publication of CN115143996A publication Critical patent/CN115143996A/en
Application granted granted Critical
Publication of CN115143996B publication Critical patent/CN115143996B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00: Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20: Instruments for performing navigational calculations
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38: Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804: Creation or updating of map data
    • G01C21/3807: Creation or updating of map data characterised by the type of data
    • G01C21/3815: Road data
    • G01C21/3819: Road shape data, e.g. outline of a route
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20: Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29: Geographical information databases
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/50: Context or environment of the image
    • G06V20/56: Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58: Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/50: Context or environment of the image
    • G06V20/56: Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588: Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road

Abstract

The invention discloses a positioning information correction method, an electronic device, and a storage medium. The method comprises: acquiring the visual lane line detection result, in the vehicle body coordinate system, of the road where the self-vehicle is located in the current frame, and the high-precision map lane lines in the UTM coordinate system; pairing the visual lane lines with the high-precision lane lines to obtain at least one lane line group; for each lane line group, converting its high-precision lane line into a lane point set in the vehicle body coordinate system, and calculating the deviation cost value of the high-precision lane line and its paired visual lane line from the lane point set and that visual lane line; calculating a correction error based on the deviation cost values; and correcting the positioning information according to the correction error. By computing a correction error value from the visual lane line information and the high-precision map lane line information and applying it to the self-vehicle's positioning information, the embodiments of the invention improve the accuracy and stability of the self-vehicle positioning system and make it less susceptible to scene interference.

Description

Positioning information correction method, electronic device, and storage medium
Technical Field
The present invention relates to the field of automatic driving technologies, and in particular, to a positioning information correction method, an electronic device, and a storage medium.
Background
With the development of artificial intelligence, automatic driving technology has matured considerably. It can be briefly divided into perception, prediction, positioning, decision, planning, and control. Positioning is one of the key parts of automatic driving, and the coordinate systems used in automatic driving mainly comprise: the world coordinate system (for example, the WGS-84 coordinate system), the UTM (Universal Transverse Mercator grid system) coordinate system, and the DR (dead reckoning) coordinate system, which relate to positioning information; the vehicle body coordinate system; and the coordinate system of each sensor, which relate to the sensing information.
After processing by the positioning module, positioning information is usually expressed in the UTM, DR, and vehicle body coordinate systems; after processing by the sensing module, sensing information can be converted into a description in the vehicle body coordinate system. The origin of the vehicle body coordinate system moves with the vehicle in real time, so that coordinate system is unrelated across the time dimension. The UTM coordinate system has a fixed origin that does not change over time, so it can be used to process data across the time dimension; however, converting sensing information between the UTM coordinate system and the vehicle body coordinate system requires the vehicle's UTM positioning information, and this conversion introduces the vehicle's positioning precision error, is easily affected by the scene, and carries a risk of positioning failure. DR information comes from an IMU (Inertial Measurement Unit) sensor and an odometer; it is little affected by the scene and is highly precise over short periods, but is limited by sensor precision and accumulates a large error over long periods.
In the related technology, a combination mode of a UTM coordinate system and a vehicle body coordinate system is often adopted, perception information is described based on the vehicle body coordinate system, positioning information and high-precision map information are described based on the UTM coordinate system, and the positioning information and the high-precision map information are converted through self vehicle UTM positioning information.
The inventor finds that: the coordinate conversion process introduces the vehicle's positioning precision error, so describing perception information in the UTM coordinate system introduces positioning error, reduces perception precision, is easily affected by the scene, and carries a risk of positioning failure; meanwhile, DR information from the IMU sensor and odometer, while little affected by the scene and precise over short periods, is limited by sensor precision and accumulates a large error over long periods.
Disclosure of Invention
The embodiment of the invention aims to solve at least one of the technical problems.
In a first aspect, an embodiment of the present invention provides a method for correcting positioning information, including: acquiring the visual lane line detection result, in the vehicle body coordinate system, of the road where the self-vehicle is located in the current frame, and the high-precision map lane lines in the Universal Transverse Mercator (UTM) coordinate system, wherein the visual lane line detection result comprises at least one visual lane line and the high-precision map lane lines comprise at least one high-precision lane line; pairing the visual lane lines with the high-precision lane lines to obtain at least one lane line group; for each lane line group, converting the high-precision lane line in the group into a lane point set in the vehicle body coordinate system, and calculating the deviation cost value of the high-precision lane line and the paired visual lane line according to the lane point set and the paired visual lane line; calculating a correction error based on the deviation cost value of each high-precision lane line and its paired visual lane line; and correcting the positioning information according to the correction error.
In a second aspect, an embodiment of the present invention provides a positioning information correction apparatus, including: an acquisition module for acquiring the visual lane line detection result, in the vehicle body coordinate system, of the road where the self-vehicle is located in the current frame, and the high-precision map lane lines in the Universal Transverse Mercator (UTM) coordinate system, wherein the visual lane line detection result comprises at least one visual lane line and the high-precision map lane lines comprise at least one high-precision lane line; a pairing module for pairing the visual lane lines with the high-precision lane lines to obtain at least one lane line group; a first calculation module for converting the high-precision lane line in each lane line group into a lane point set in the vehicle body coordinate system, and calculating the deviation cost value of the high-precision lane line and the paired visual lane line according to the lane point set and the paired visual lane line; a second calculation module for calculating a correction error based on the deviation cost value of each high-precision lane line and its paired visual lane line; and a correction module for correcting the positioning information according to the correction error.
In a third aspect, an embodiment of the present invention provides an electronic device, including: at least one processor, and a memory communicatively coupled to the at least one processor, wherein the memory stores instructions executable by the at least one processor, the instructions being executable by the at least one processor to enable the at least one processor to perform the steps of the positioning information correction method of any of the embodiments of the present invention.
In a fourth aspect, an embodiment of the present invention provides a storage medium, on which a computer program is stored, where the computer program is executed by a processor to implement the steps of the positioning information correction method according to any embodiment of the present invention.
In a fifth aspect, the present invention further provides a computer program product, where the computer program product includes a computer program stored on a non-volatile computer-readable storage medium, where the computer program includes program instructions, and when the program instructions are executed by a computer, the computer executes the steps of the positioning information correction method according to any embodiment of the present invention.
In a sixth aspect, an embodiment of the present invention further provides a mobile tool with a vision sensor mounted thereon, where the mobile tool includes the electronic device described in the third aspect, and the vision sensor is in communication connection with the electronic device.
According to the method and the device, the accumulated error of the coordinates of the self-vehicle is corrected through the visual lane line information and the high-precision map lane line information, the accuracy and the stability of the positioning system of the self-vehicle are improved, the positioning system after correction can be used by other modules of the self-vehicle, the scene interference is not easy to cause, and all modules of the self-vehicle can be accurately and stably used in the process of calling the positioning system.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a flowchart of a method for correcting positioning information according to an embodiment of the present invention;
fig. 2 is a schematic diagram of a high-precision map lane line numbering rule provided in the embodiment of the present invention;
FIG. 3 is a schematic diagram of three coordinate systems in an embodiment of the present invention;
FIG. 4 is a diagram illustrating time synchronization of various information according to an embodiment of the present invention;
fig. 5 is a schematic view of a flow structure of positioning information correction according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of a positioning information correction apparatus according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As will be appreciated by one skilled in the art, embodiments of the present application may be embodied as a system, apparatus, device, method, or computer program product. Accordingly, the present disclosure may be embodied in the form of: entirely hardware, entirely software (including firmware, resident software, micro-code, etc.), or a combination of hardware and software.
For convenience of understanding, technical terms related to the present application are explained as follows: the term "mobile device" as used herein includes, but is not limited to, vehicles in the six driving-automation levels, L0-L5, established by the Society of Automotive Engineers International (SAE International) or the Chinese national standard "Taxonomy of driving automation for vehicles".
In some embodiments, the mobile device may be a vehicle apparatus or a robot apparatus having various functions as follows:
(1) Manned functions, such as home cars, buses, and the like;
(2) Cargo carrying functions such as common trucks, van trucks, dump trucks, enclosed trucks, tank trucks, flat trucks, container vans, dump trucks, special structure trucks and the like;
(3) Tool functions, such as logistics distribution vehicles, automated Guided Vehicles (AGVs), patrol cars, cranes, excavators, bulldozers, forklifts, road rollers, loaders, off-road vehicles, armored vehicles, sewage treatment vehicles, sanitation vehicles, vacuum cleaners, ground cleaning vehicles, watering lorries, sweeping robots, dining delivery robots, shopping guide robots, lawn mowers, golf carts, etc.;
(4) Entertainment functions, such as recreational vehicles, casino automatic drives, balance cars, and the like;
(5) Special rescue functions such as fire trucks, ambulances, electrical power breakdown trucks, engineering emergency trucks and the like.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict.
The invention may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
As used in this disclosure, "module," "device," "system," and the like are intended to refer to a computer-related entity, either hardware, a combination of hardware and software, or software in execution. In particular, for example, an element may be, but is not limited to being, a process running on a processor, an object, an executable, a thread of execution, a program, and/or a computer. Also, an application or script running on a server, or a server, may be an element. One or more elements may be in a process and/or thread of execution and an element may be localized on one computer and/or distributed between two or more computers and may be operated by various computer-readable media. The elements may also communicate by way of local and/or remote processes based on a signal having one or more data packets, e.g., from a data packet interacting with another element in a local system, distributed system, and/or across a network in the internet with other systems by way of the signal.
Finally, it should also be noted that, in this document, relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Furthermore, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising … …" does not exclude the presence of another like element in a process, method, article, or apparatus that comprises the element.
Example one
Fig. 1 is a flowchart of a positioning information correction method according to an embodiment of the present invention, which includes the following steps:
s11: obtaining a visual lane line detection result of a road where a current frame self vehicle is located in a vehicle body coordinate system and a high-precision map lane line in a UTM coordinate system, wherein the visual lane line detection result comprises at least one visual lane line, and the high-precision map lane line comprises at least one high-precision lane line;
s12: pairing the visual lane line and the high-precision lane line to obtain at least one lane line group;
s13: for each lane line group, converting the high-precision lane lines in the lane line group into a lane point set under a vehicle body coordinate system to obtain a lane point set under the vehicle body coordinate system, and calculating the deviation cost value of the high-precision lane lines and paired visual lane lines according to the lane point set and the paired visual lane lines;
s14: calculating a correction error based on the deviation cost value of each high-precision lane line and the paired visual lane lines;
s15: and correcting the positioning information according to the correction error.
In the embodiment, the correction error value is calculated based on the known visual lane line information and the high-precision map lane line information, and the correction error value is corrected to the positioning information of the self-vehicle, so that the accuracy and the stability of the self-vehicle positioning system are improved, the self-vehicle positioning system is not easily interfered by scenes, and other modules can call the accurate and stable positioning information.
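Steps S11-S15 can be read as a small pipeline. The following is an illustrative sketch only: every helper function is an injected stand-in (not an API from the patent), and poses, lanes, and errors are reduced to scalars for brevity.

```python
def correct_positioning(visual_lanes, hd_lanes, pair_fn, convert_fn,
                        cost_fn, error_fn, pose):
    """Scalar sketch of steps S11-S15; all helpers are injected stand-ins."""
    # S12: pair visual lane lines with high-precision map lane lines
    groups = pair_fn(visual_lanes, hd_lanes)
    # S13: convert each HD lane into the body frame and score it against its partner
    costs = [cost_fn(convert_fn(hd), vis) for hd, vis in groups]
    # S14: fuse the per-group deviation cost values into one correction error
    correction = error_fn(costs)
    # S15: apply the correction error to the positioning information
    return pose + correction
```

With trivial stand-ins (absolute difference as cost, mean as error fusion), the sketch shows only the data flow, not the patent's actual cost or fusion formulas.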
In step S11, positioning data of a current vehicle positioning system is obtained, a current road on which the vehicle is located is determined according to the positioning data, and a visual lane detection result in a vehicle body coordinate system is obtained from the lane detection module, where the visual lane detection result includes at least one visual lane, and the visual lane is described in the vehicle body coordinate system. And acquiring the high-precision map lane line of the current road of the vehicle from the high-precision map database based on the positioning data. In the embodiment of the invention, the high-precision map lane line is described in a UTM coordinate system.
And step S12, pairing the acquired visual lane lines and the high-precision lane lines to obtain at least one lane line group.
In the embodiment of the invention, the positioning information comprises original DR positioning information in a DR coordinate system and original UTM positioning information in a UTM coordinate system.
In an embodiment, in step S12, the pairing of the visual lane line and the high-precision lane line to obtain at least one lane line group specifically includes: preliminarily pairing the visual lane lines and the high-precision lane lines according to the numbering rules of the visual lane lines and the numbering rules of the high-precision lane lines to obtain at least one initial lane line group; calculating the difference degree between the high-precision lane line and the visual lane line in each initial lane line group; if the difference degree is larger than a preset threshold value, deleting the initial lane group; and if the difference degree is not greater than the threshold value, determining the initial lane group as the lane group.
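A minimal sketch of this pairing-and-filtering step, assuming lanes are keyed by their ids and the numbering rule is supplied as an HD-id to visual-id mapping; both the mapping shape and the scalar lane representation are assumptions for illustration.

```python
def pair_lanes(visual, hd, id_map, diff_fn, threshold):
    """visual/hd: dicts id -> lane; id_map: HD-map lane id -> visual lane id
    (the numbering rule); diff_fn: difference degree between the two lanes."""
    groups = []
    for hd_id, vis_id in id_map.items():
        if hd_id in hd and vis_id in visual:
            pair = (hd[hd_id], visual[vis_id])
            # keep the initial group only if the difference degree is within threshold
            if diff_fn(*pair) <= threshold:
                groups.append(pair)
    return groups
```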
As shown in fig. 2, which illustrates the high-precision lane line numbering rule: for the high-precision map, the id of the left lane line Lh is "-2" and the id of the right lane line Rh is "-3"; in the visual detection result (VSN), with the vehicle's viewing angle as reference, the id of the left lane line Lvsn is "-1" and the id of the right lane line Rvsn is "-2". According to the two numbering rules, 2 initial lane line groups [Lh, Lvsn] and [Rh, Rvsn] can therefore be obtained.
Preferably, in order to further confirm the accuracy of the initial lane line groups, the embodiment of the present invention further calculates the difference degree between the high-precision lane line and the visual lane line in each initial lane line group; if the difference degree is large, the pairing is judged erroneous and the initial lane line group is deleted, and if the difference degree is small, the pairing is judged accurate and the initial lane line group is retained as the final lane line group.
Calculating the difference degree between the high-precision lane line and the visual lane line in the initial lane line group, which can be specifically calculated according to the following method: and converting the high-precision lane lines into a vehicle body coordinate system to obtain a lane point set in the vehicle body coordinate system, and calculating the difference between the high-precision lane lines and the paired visual lane lines according to the lane point set and the paired visual lane lines.
In a specific embodiment, calculating the difference between the high-precision lane line and the paired visual lane lines according to the lane point set and the paired visual lane lines specifically includes: and calculating the minimum distance from each point in the lane point set to the paired visual lane lines, and determining the difference between the high-precision lane line and the paired visual lane lines based on the minimum distance from each point to the visual lane lines.
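Since the visual lane line is given as an equation in the body frame, the minimum point-to-line distance can be approximated by densely sampling the curve. The polynomial lane representation and the sampling range below are assumptions for illustration, not specified by the patent.

```python
import numpy as np

def min_distance_to_lane(point, coeffs, x_range=(-5.0, 50.0), samples=500):
    """Approximate the minimum distance from a body-frame point to a visual
    lane line given as a polynomial x -> y (coeffs, highest order first)."""
    xs = np.linspace(*x_range, samples)
    ys = np.polyval(coeffs, xs)        # lane line sampled along x
    d = np.hypot(xs - point[0], ys - point[1])
    return float(d.min())
```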
In some embodiments, the high-precision lane lines are converted into the host vehicle body coordinate system to obtain a set of lane points in that coordinate system. First, define the lane line detection result VSN, which comprises a timestamp T_VSN and n lane line equations L_n expressed in the vehicle body coordinate system. Define O_DR as the original DR positioning information in the DR coordinate system, comprising timestamp T_O_DR, position P_O_DR(x_dr, y_dr, z_dr), and angle R_O_DR(rx_dr, ry_dr, rz_dr); O_DR is high-frequency data. Define C_DR as the virtual DR positioning information in the DR coordinate system, comprising timestamp T_C_DR, position P_C_DR(x_cdr, y_cdr, z_cdr), and angle R_C_DR(rx_cdr, ry_cdr, rz_cdr); C_DR is low-frequency data time-synchronized with the visual lane line detection result. The position of any point in the vehicle body coordinate system is denoted P(x, y, z) and its angle R(rx, ry, rz). Define O_G as the original positioning information in the UTM coordinate system and C_G as the corrected positioning information in the UTM coordinate system; the position of any point in the UTM coordinate system is denoted P_G(x_G, y_G, z_G) and its angle R_G(rx_G, ry_G, rz_G). Based on these definitions, the self-vehicle's position in the UTM coordinate system is P_g_car(x_g0, y_g0, z_g0) and its angle is R_g_car(rx_g0, ry_g0, rz_g0). As shown in fig. 3, the vehicle body coordinate system, the UTM coordinate system, and the DR coordinate system are illustrated from left to right, respectively.
The vehicle body coordinate system and the UTM coordinate system have the conversion relation shown in formula (1):
[Formula (1): equation image not reproduced in the source]
In formula (1), R_G is the transformation matrix from the UTM coordinate system to the vehicle body coordinate system, R_x is the rotation within the yz plane, R_y the rotation within the xz plane, and R_z the rotation within the xy plane.
Based on the self-vehicle's position P_car(x_0, y_0, z_0) and angle R_car(rx_0, ry_0, rz_0) in the DR coordinate system, the vehicle body coordinate system and the DR coordinate system are related as shown in formula (2):
[Formula (2): equation image not reproduced in the source]
In formula (2), R_DR is the transformation matrix from the DR coordinate system to the vehicle body coordinate system, R_x is the rotation within the yz plane, R_y the rotation within the xz plane, and R_z the rotation within the xy plane. While the self-vehicle is running, the time-synchronized low-frequency visual lane line detection result VSN and the lane line with the same ID in the self-vehicle's high-precision map need to be selected, and a deviation cost value Fline_n is calculated, where the subscript n denotes the lane ID; for example, the deviation cost value of lane "-1" is Fline_-1. The calculation process is as follows:
positioning information C _ DR according to the last frame virtual DR T-1 And O _ DR variable quantity time-synchronous with the visual lane line detection result to obtain the virtual C _ DR of the current frame T :
C_DR T = C_DR T-1 + (O_DR T - O_DR T-1 )R ODR R CDR -1 (ii) a Formula (3)
In the formula (3), R ODR 、R CDR Conversion matrix R of DR coordinate system and vehicle body coordinate system DR With the difference that R ODR The parameters used in the formula are related to O _ DR, R CDR The parameter used in (1) is related to C _ DR, R CDR -1 Is R CDR Inverse matrix of, O _ DR T Original DR positioning information, O _ DR, for current frame time synchronization T-1 Original DR positioning information time-synchronized for the previous frame.
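A planar sketch of formula (3), in which the full 3-D rotation matrices R_ODR and R_CDR are reduced to single-yaw 2-D rotations; this reduction and the row-vector convention are illustrative assumptions.

```python
import numpy as np

def rot2d(theta):
    """2-D rotation matrix for yaw angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def propagate_virtual_dr(c_prev, o_now, o_prev, yaw_odr, yaw_cdr):
    """2-D sketch of formula (3): carry the virtual DR pose forward by the
    raw DR increment, re-expressed through the two frame rotations."""
    delta = np.asarray(o_now) - np.asarray(o_prev)      # O_DR_T - O_DR_{T-1}
    return np.asarray(c_prev) + delta @ rot2d(yaw_odr) @ np.linalg.inv(rot2d(yaw_cdr))
```

With both yaws zero, the update degenerates to adding the raw DR increment, which matches the intent of formula (3).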
The visual lane line acquired in the embodiment of the present invention is time-synchronized with the acquired positioning information. Specifically, the time-synchronized visual lane line detection result VSN and O_DR are acquired according to the time synchronization principle, as shown in fig. 4.
The virtual C_G of the current frame, C_G_T, is calculated according to the following formula (4):
C_G_T = C_G_{T-1} + (O_DR_T − O_DR_{T-1}) · R_ODR · R_G⁻¹    (4)
In formula (4), C_G_T is the virtual UTM positioning information of the current frame, C_G_{T-1} is the time-synchronized UTM positioning information of the previous frame, R_ODR is the transformation matrix between the DR coordinate system and the vehicle body coordinate system, and R_G⁻¹ is the inverse matrix of R_G.
In the aforementioned step S12, the high-precision lane lines are converted into the vehicle body coordinate system based on C_G_T to obtain the lane point set in the vehicle body coordinate system, and the difference degree between the high-precision lane line and the paired visual lane line is calculated from the lane point set and the paired visual lane line: first the minimum distance from each point in the lane point set to the paired visual lane line is calculated, and the difference degree is then determined from these minimum distances. For example, the difference degree between the high-precision lane line and the paired visual lane line can be calculated as
diff = Σ_m (P_m · S_m)
where S_m denotes the minimum distance from the m-th point in the lane point set to the paired visual lane line, and P_m is the weight of the m-th point.
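The weighted difference degree above reduces to a dot product of the per-point minimum distances S_m with the weights P_m; a direct sketch:

```python
def lane_difference(min_dists, weights):
    """Difference degree: weighted sum of per-point minimum distances
    (each Sm weighted by its Pm)."""
    assert len(min_dists) == len(weights)
    return sum(p * s for s, p in zip(min_dists, weights))
```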
The foregoing steps S13 and S14 can be specifically realized by the following scheme.
In the foregoing step S13, the high-precision lane line in each lane line group is converted into the vehicle body coordinate system in the same manner as in step S12 to obtain a lane point set in the vehicle body coordinate system, which is not repeated here. The deviation cost value of the high-precision lane line and the paired visual lane line is then calculated from the lane point set and the paired visual lane line, specifically as follows: a plurality of preset discrete error values is acquired; for each discrete error value, the discrete error value is superimposed on each point in the lane point set to obtain a sampling point set; the minimum distance from each point in the sampling point set to the paired visual lane line is calculated, and the deviation cost value of the high-precision lane line and the paired visual lane line under that discrete error value is determined from these minimum distances.
For example, the value range of the error value is determined based on the accuracy of the vehicle-mounted IMU and the vehicle-mounted odometer: the larger of the two accuracies is taken as the maximum value max of the range, 0 to max is taken as the value range of the error value, N discrete error values are sampled from this range, and the i-th discrete error value is denoted ui.
For each lane line group: first, the high-precision lane line in the group is converted into the vehicle body coordinate system as in step S13 to obtain a lane point set Pline_n = {Pline(x0, y0, rz0)_n, Pline(x1, y1, rz1)_n, ..., Pline(xm, ym, rzm)_n}. Second, the lane point set Pline_n is superimposed with each discrete error value in turn to obtain N sampling point sets; taking the superposition of Pline_n and ui as an example, a sampling point set uiPline_n = {Pline(x0, y0, rz0) + ui, Pline(x1, y1, rz1) + ui, ..., Pline(xm, ym, rzm) + ui} is obtained. For each sampling point set, the minimum distance from each point to the paired visual lane line is calculated, and the deviation cost value of the high-precision lane line and the paired visual lane line under that discrete error value is computed from these minimum distances. Taking uiPline_n as an example, the minimum distance s_m from each point in uiPline_n to the corresponding lane line equation Ln in VSN is solved, and the deviation cost value is

uiF_line_n = Σ_m P_m · s_m

This yields a deviation cost set UF_line_n = {u0F_line_n, u1F_line_n, ..., uNF_line_n}.
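The discrete-error sampling of step S13 can be sketched as below. The uniform discretization of [0, max] and the lateral-only (y-axis) shift of the point set are assumptions of this sketch; the unit weights P_m = 1 are used for brevity.

```python
import numpy as np

def deviation_cost_set(lane_points, visual_line_pts, max_err, n_samples):
    """Deviation costs uiF_line_n for N discrete error values.

    Discretizes [0, max_err] into n_samples values u_i (per the text,
    max_err comes from IMU/odometer accuracy), shifts the lane point
    set by each u_i, and scores each shifted set against the visual
    line by summed minimum distances.
    """
    lane_points = np.asarray(lane_points, float)
    visual = np.asarray(visual_line_pts, float)
    errors = np.linspace(0.0, max_err, n_samples)
    costs = []
    for u in errors:
        shifted = lane_points + np.array([0.0, u])  # sampling point set uiPline_n
        d = np.linalg.norm(shifted[:, None, :] - visual[None, :, :], axis=2)
        costs.append(float(d.min(axis=1).sum()))    # P_m = 1 in this sketch
    return errors, np.array(costs)
```

The cost is smallest for the discrete error that best aligns the map points with the visual line, which is exactly what the later weighting exploits.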
For step S14, the correction error is calculated from the deviation cost values of each high-precision lane line and its paired visual lane line obtained in step S13: a sampling weight is calculated for each discrete error value, the sampling weight being inversely related to the deviation cost value, and the correction error is obtained by weighted calculation. The sampling weight of the discrete error value is calculated by formula (5):

w_i = (1 / Σ_{n=1}^{L} uiF_line_n) / Σ_{j=1}^{N} (1 / Σ_{n=1}^{L} ujF_line_n);  formula (5)

In formula (5), w_i is the weight value of ui, L is the number of pairings of visual lane lines in VSN with high-precision lane lines, and N is the number of discrete error values.
The correction error is calculated according to the following formula (6):

uc = Σ_{i=1}^{N} w_i · u_i;  formula (6)

In formula (6), uc represents the correction error and w_i represents the weight of ui.
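The weighting of step S14 can be sketched as follows. Formula (5) appears only as an image in the source, so the reciprocal-cost normalization used here is an assumption that satisfies the stated inverse relationship between sampling weight and deviation cost.

```python
import numpy as np

def correction_error(errors, cost_sets, eps=1e-6):
    """Correction error uc = sum_i w_i * u_i (formula (6)).

    cost_sets has shape (N, L): the deviation cost of each discrete
    error u_i against each of the L paired lane lines. Each weight w_i
    is taken inversely proportional to the total cost of u_i (assumed
    normalization; eps guards against division by zero).
    """
    errors = np.asarray(errors, float)
    total = np.asarray(cost_sets, float).sum(axis=1)  # cost per discrete error
    inv = 1.0 / (total + eps)                         # inverse relationship
    w = inv / inv.sum()                               # normalize: sum_i w_i = 1
    return float(np.dot(w, errors))
```

A discrete error whose shifted point set lies nearly on the visual line receives almost all of the weight, so uc converges toward that error value.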
For step S15, the specific implementation of correcting the positioning information according to the correction error obtained by the weighted calculation may be as follows:
calculating to obtain the current frame virtual DR positioning information according to the previous frame virtual DR positioning information, the previous frame time synchronization original DR positioning information, the current frame time synchronization original DR positioning information, a conversion matrix between a DR coordinate system and a vehicle body coordinate system and the correction error, and referring to the following formula (7);
calculating to obtain current-time DR positioning information according to the current-time virtual DR positioning information, the current-time original DR positioning information, the previous-time original DR positioning information and a conversion matrix between a DR coordinate system and a vehicle body coordinate system, and referring to the following formula (9);
and calculating to obtain the current frame UTM positioning information according to the previous frame UTM positioning information, the previous frame time synchronization original DR positioning information, the current frame time synchronization original DR positioning information, the conversion matrix between the DR coordinate system and the vehicle body coordinate system, the conversion matrix between the vehicle body coordinate system and the UTM coordinate system and the correction error, and referring to the following formula (8).
C_DR_T = C_DR_{T-1} + (O_DR_T − O_DR_{T-1}) · R_ODR · R_CDR⁻¹ + uc · R_CDR⁻¹;  formula (7)

C_G_T = C_G_{T-1} + (O_DR_T − O_DR_{T-1}) · R_ODR · R_G⁻¹ + uc · R_G⁻¹;  formula (8)
In formulas (7) to (8), C_DR_T is the virtual DR positioning information of the current frame, C_DR_{T-1} is the virtual DR positioning information of the previous frame, O_DR_{T-1} is the time-synchronized raw DR positioning information of the previous frame, O_DR_T is the time-synchronized raw DR positioning information of the current frame, R_ODR is the transformation matrix between the DR coordinate system and the vehicle body coordinate system, uc is the correction error, R_CDR is the transformation matrix between the DR coordinate system and the vehicle body coordinate system (with parameters taken from C_DR), C_G_{T-1} is the UTM positioning information of the previous frame, R_G is the transformation matrix between the vehicle body coordinate system and the UTM coordinate system, and C_G_T is the UTM positioning information of the current frame.
CO_DR_{t+1} = C_DR_T + (O_DR_{t+1} − O_DR_t) · R_ODR · R_CDR⁻¹;  formula (9)
In formula (9), C_DR_T is the virtual DR positioning information of the current frame, CO_DR_{t+1} is the DR positioning information of the current moment, O_DR_t is the raw DR positioning information of the previous moment, and O_DR_{t+1} is the raw DR positioning information of the current moment; R_ODR and R_CDR are both the transformation matrix R_DR between the DR coordinate system and the vehicle body coordinate system.
In the embodiment of the present invention, the output frequency of the visual lane line detection result is low, e.g., 10 Hz, i.e., one visual lane line detection result is output every 0.1 second; the output frequency of the raw DR positioning information in the DR coordinate system is high, e.g., 100 Hz, i.e., one output every 0.01 second. Taking the output frequency of the visual lane line detection result as the reference, when calculating the low-frequency C_DR, the raw DR positioning information time-synchronized with the visual lane line detection result needs to be acquired to participate in the calculation. Suppose time T is the current frame time of the visual lane line detection result: the raw DR positioning information at times T and T-1 is searched from the raw DR positioning information in the DR coordinate system, and these values participate in the calculation of formulas (7) and (8) to obtain C_DR_T and C_G_T at time T. Before the next frame (i.e., frame T+1) of the visual lane line detection result is received, a large amount of high-frequency raw DR positioning information is also received; this raw DR positioning information is processed by formula (9).
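The time-synchronization lookup described above can be sketched as a nearest-timestamp search over the high-frequency DR buffer. The nearest-sample policy (rather than interpolation between samples) is an assumption of this sketch.

```python
import bisect

def time_sync_dr(dr_stamps, dr_records, vision_stamp):
    """Pick the raw DR record time-synchronized with a vision frame.

    dr_stamps must be sorted ascending; since DR runs at ~100 Hz and
    vision at ~10 Hz, a nearby DR sample always exists. Returns the
    record whose timestamp is closest to vision_stamp.
    """
    i = bisect.bisect_left(dr_stamps, vision_stamp)
    if i == 0:
        return dr_records[0]
    if i == len(dr_stamps):
        return dr_records[-1]
    before, after = dr_stamps[i - 1], dr_stamps[i]
    # choose whichever neighbor is closer in time to the vision frame
    if after - vision_stamp < vision_stamp - before:
        return dr_records[i]
    return dr_records[i - 1]
```

Calling this once for the current vision frame and once for the previous one yields the O_DR_T and O_DR_{T-1} terms used in formulas (7) and (8).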
In this embodiment, a preset deviation cost threshold is used to decide whether the correction error is calculated from the current lane line: if the calculated deviation cost value is greater than the threshold, the lane line is considered problematic and no subsequent processing is performed, which improves the accuracy of the correction error.
The overall structure of the technical solution of the present invention is shown in fig. 5. First, raw positioning information is acquired, including the raw positioning information O_DR in the DR coordinate system (i.e., A, high-frequency raw DR information) and the raw positioning information O_G in the UTM coordinate system (i.e., B, raw UTM information), assuming the O_G raw positioning is valid. Then, according to the time-synchronized visual lane line detection result VSN and the lane lines in the high-precision map, O_DR is corrected to obtain the virtual positioning information C_DR (low-frequency data), and O_G is corrected to obtain C_G (i.e., B, corrected UTM information). Finally, CO_DR (A, corrected high-frequency DR information) is obtained from the virtual positioning information C_DR and O_DR, and the corrected high-frequency DR information and corrected UTM information are sent as the current positioning information to other modules in the automatic driving system.
Example two
As shown in fig. 6, which is a schematic structural diagram of a positioning information correction device according to an embodiment of the present invention, the device can execute the positioning information correction method of any of the above embodiments and is configured in a terminal.
The present embodiment provides a positioning information correction apparatus 100 including: the system comprises an acquisition module 110, a pairing module 120, a first calculation module 130, a second calculation module 140, and a correction module 150.
The acquiring module 110 is configured to acquire a visual lane line detection result, in the vehicle body coordinate system, of the road where the current vehicle is located, and a high-precision map lane line of the road in the Universal Transverse Mercator (UTM) coordinate system, where the visual lane line detection result includes at least one visual lane line and the high-precision map lane line includes at least one high-precision lane line; the pairing module 120 is configured to pair the visual lane lines and the high-precision lane lines to obtain at least one lane line group; the first calculation module 130 is configured to, for each lane line group, convert the high-precision lane line in the group into the vehicle body coordinate system to obtain a lane point set in the vehicle body coordinate system, and calculate the deviation cost value of the high-precision lane line and the paired visual lane line from the lane point set and the paired visual lane line; the second calculation module 140 is configured to calculate a correction error based on the deviation cost values of each high-precision lane line and the paired visual lane line; and the correction module 150 is configured to correct the positioning information according to the correction error.
Please refer to step S11 in the first embodiment for specific implementation of the obtaining module 110, which is not described herein again. Please refer to step S12 in the first embodiment for specific implementation of the pairing module 120, which is not described herein again. Please refer to step S13 in the first embodiment for specific implementation of the first calculating module 130, which is not described herein again. For the specific implementation of the second calculating module 140, refer to step S14 in the first embodiment, which is not described herein again. Please refer to step S15 in the first embodiment for specific implementation of the correcting module 150, which is not described herein again.
It should be noted that the present invention also provides alternatives: the UTM coordinate system and the vehicle body coordinate system adopted by the invention can be replaced according to different standards. For example, the vehicle body coordinate system specified in SAE differs from that of the invention; after the coordinate system definition is changed, the transformation matrices in the formulas change accordingly, but the overall process is similar. When calculating the deviation cost value, the invention attenuates the weight of the lane line points gradually from near to far; in practical applications, the weight of each lane line point can be adjusted according to input data such as the accuracy of the visual detection result. The invention calculates the deviation amount by a particle filtering method, but Kalman filtering, Bayesian filtering and other methods can also be used, and a discrete sampling method can even be used to calculate the deviation amount directly.
It should be noted that the invention uses the visual lane line detection result to correct the accumulated error of the raw DR information and the jitter error of the raw UTM information, which improves the precision and stability of the positioning system and provides stable positioning information for other modules of automatic driving.
EXAMPLE III
The third embodiment of the present invention further provides a non-volatile computer storage medium storing computer-executable instructions that can execute the positioning information correction method in any of the above method embodiments. As one embodiment, the non-volatile computer storage medium of the present invention stores computer-executable instructions configured to: acquire a visual lane line detection result in the vehicle body coordinate system and a high-precision map lane line in the UTM coordinate system; convert the high-precision map lane line into the vehicle body coordinate system to obtain a lane point set in the vehicle body coordinate system; solve the minimum distance from each point in the lane point set to a lane line in the visual lane line detection result to obtain a deviation cost set of the lane point set; and calculate a correction error based on the deviation cost set, where the correction error is used to correct the positioning information that needs to be transformed relative to the vehicle body coordinate system.
The non-volatile computer-readable storage medium may be used to store non-volatile software programs, non-volatile computer-executable programs, and modules, such as the program instructions/modules corresponding to the methods in the embodiments of the present invention. One or more program instructions are stored in the non-volatile computer-readable storage medium and, when executed by a processor, perform the positioning information correction method in any of the above method embodiments.
Example four
An embodiment of the present invention further provides an electronic device, which includes: at least one processor, and a memory communicatively coupled to the at least one processor, wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform a method of positioning information correction.
EXAMPLE five
In some embodiments, a fifth embodiment of the present invention further provides a mobile device, which includes a body and the electronic apparatus of any of the foregoing embodiments installed on the body. The mobile device may be an unmanned vehicle such as an unmanned sweeper, an unmanned floor-washing vehicle, an unmanned logistics vehicle, an unmanned passenger vehicle, an unmanned sanitation vehicle, an unmanned minibus or bus, a truck, or a mine car, or may be a robot or the like.
Example six
In some embodiments, the sixth embodiment of the present invention further provides a computer program product which, when run on a computer, causes the computer to execute the positioning information correction method according to any one of the embodiments of the present invention.
In some embodiments, the present invention further provides a computer program product including a computer program stored on a non-volatile computer-readable storage medium, the computer program including program instructions which, when executed by a computer, cause the computer to perform any one of the above positioning information correction methods.
EXAMPLE seven
Fig. 7 is a schematic diagram of a hardware structure of an electronic device according to a positioning information correction method provided in another embodiment of the present application, and as shown in fig. 7, the electronic device includes: one or more processors 710 and a memory 720, one processor 710 being illustrated in fig. 7. The apparatus of the positioning information correction method may further include: an input device 730 and an output device 740.
The processor 710, the memory 720, the input device 730, and the output device 740 may be connected by a bus or other means, such as the bus connection in fig. 7.
The memory 720, which is a non-volatile computer-readable storage medium, can be used to store non-volatile software programs, non-volatile computer-executable programs, and modules, such as program instructions/modules corresponding to the positioning information correction method in the embodiments of the present application. The processor 710 executes the non-volatile software program, instructions and modules stored in the memory 720, so as to execute various functional applications and data processing of the server, that is, to implement the positioning information correction method of the above method embodiment.
The memory 720 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data and the like. Further, the memory 720 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some embodiments, memory 720 optionally includes memory located remotely from processor 710, which may be connected to a mobile device via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input device 730 may receive input numeric or character information. The output device 740 may include a display device such as a display screen.
The one or more modules are stored in the memory 720 and when executed by the one or more processors 710 perform the positioning information correction method in any of the method embodiments described above.
The product can execute the method provided by the embodiment of the application, and has the corresponding functional modules and beneficial effects of the execution method. For technical details that are not described in detail in this embodiment, reference may be made to the methods provided in the embodiments of the present application.
The non-volatile computer-readable storage medium may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the use of the device, and the like. Further, the non-volatile computer-readable storage medium may include high speed random access memory, and may also include non-volatile memory, such as at least one disk storage device, flash memory device, or other non-volatile solid state storage device. In some embodiments, the non-transitory computer readable storage medium optionally includes memory located remotely from the processor, which may be connected to the device over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
An embodiment of the present invention further provides an electronic device, which includes: at least one processor, and a memory communicatively coupled to the at least one processor, wherein the memory stores instructions executable by the at least one processor, the instructions being executable by the at least one processor to enable the at least one processor to perform the steps of the positioning information correction method of any of the embodiments of the present invention.
The electronic device of the embodiments of the present application exists in various forms, including but not limited to:
(1) Mobile communication devices, which are characterized by mobile communication capabilities and are primarily targeted at providing voice and data communications. Such terminals include smart phones, multimedia phones, functional phones, and low-end phones, among others.
(2) The ultra-mobile personal computer equipment belongs to the category of personal computers, has calculation and processing functions and generally has the characteristic of mobile internet access. Such terminals include PDA, MID, and UMPC devices, such as tablet computers.
(3) Portable entertainment devices such devices may display and play multimedia content. The devices comprise audio and video players, handheld game consoles, electronic books, intelligent toys and portable vehicle-mounted navigation devices.
(4) Other mobile devices with data processing capabilities.
In this document, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising … …" does not exclude the presence of another like element in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware. With this understanding in mind, the above-described technical solutions may be embodied in the form of a software product, which can be stored in a computer-readable storage medium such as ROM/RAM, magnetic disk, optical disk, etc., and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in the embodiments or some parts of the embodiments.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (10)

1. A method for correcting positioning information comprises the following steps:
acquiring a visual lane line detection result, in a vehicle body coordinate system, of a road where a self vehicle is located at a current frame, and a high-precision map lane line in a Universal Transverse Mercator (UTM) coordinate system, wherein the visual lane line detection result comprises at least one visual lane line, and the high-precision map lane line comprises at least one high-precision lane line;
pairing the visual lane line and the high-precision lane line to obtain at least one lane line group;
for each lane line group, converting high-precision lane lines in the lane line group into a lane point set under a vehicle body coordinate system to obtain a lane point set under the vehicle body coordinate system, and calculating the deviation cost value of the high-precision lane lines and paired visual lane lines according to the lane point set and the paired visual lane lines;
calculating a correction error based on the deviation cost value of each high-precision lane line and the paired visual lane lines;
correcting the positioning information according to the correction error;
the positioning information includes original DR positioning information in a DR coordinate system and original UTM positioning information in a UTM coordinate system, and the high-precision lane lines in the group of lane lines are converted into a lane point set in a vehicle body coordinate system to obtain the lane point set in the vehicle body coordinate system, which specifically includes:
acquiring original DR positioning information which is time-synchronous with a current frame visual lane line detection result and a previous frame visual lane line detection result;
calculating to obtain virtual UTM positioning information of the current frame according to UTM positioning information of the previous frame, time synchronization original DR positioning information of the current frame, a conversion matrix of a DR coordinate system and a vehicle body coordinate system and a conversion matrix of the UTM coordinate system and the vehicle body coordinate system;
and converting the high-precision lane line into a lane point set under the vehicle body coordinate system based on the current frame virtual UTM positioning information to obtain the lane point set under the vehicle body coordinate system.
2. The method according to claim 1, wherein calculating the deviation cost value of the high-precision lane line and the paired visual lane lines according to the lane point set and the visual lane lines in the lane line group comprises:
acquiring a plurality of preset discrete error values;
for each discrete error value, overlapping the discrete error value with each point in the lane point set to obtain a sampling point set; and calculating the minimum distance from each point in the sampling point set to the paired visual lane lines, and determining the deviation cost value of the high-precision lane lines and the paired visual lane lines based on the discrete error value according to the minimum distance.
3. The method of claim 2, wherein calculating the correction error based on the deviation cost value of each high-precision map lane line from the corresponding visual lane line comprises:
calculating the sum of deviation cost values of a plurality of high-precision lane lines and paired visual lane lines based on the discrete error values aiming at each discrete error value, and taking the sum as a cost value corresponding to the discrete error value;
calculating the sum of the cost values corresponding to all the discrete error values to obtain a total cost value;
determining weights corresponding to the discrete error values respectively according to the cost values corresponding to the discrete error values and the total cost value;
and obtaining the correction error according to the weight and the cost value of each discrete error value.
4. The method according to claim 1, wherein correcting the positioning information according to the correction error specifically comprises:
calculating to obtain the virtual DR positioning information of the current frame according to the virtual DR positioning information of the previous frame, the original DR positioning information of the time synchronization of the current frame, a conversion matrix between a DR coordinate system and a vehicle body coordinate system and the correction error;
calculating to obtain current-moment DR positioning information according to the current-moment virtual DR positioning information, the current-moment original DR positioning information, the previous-moment original DR positioning information and a conversion matrix between a DR coordinate system and a vehicle body coordinate system;
and calculating to obtain the current frame UTM positioning information according to the previous frame UTM positioning information, the previous frame time synchronization original DR positioning information, the current frame time synchronization original DR positioning information, the conversion matrix between the DR coordinate system and the vehicle body coordinate system, the conversion matrix between the vehicle body coordinate system and the UTM coordinate system and the correction error.
5. The method of any one of claims 1 to 4, wherein pairing the visual lane line and the high-precision lane line to obtain at least one lane line group comprises:
preliminarily matching the visual lane lines and the high-precision lane lines according to the numbering rule of the visual lane lines and the numbering rule of the high-precision lane lines to obtain at least one initial lane line group;
calculating the difference degree between the high-precision lane line and the visual lane line in each initial lane line group; if the difference degree is larger than a preset threshold value, deleting the initial lane group; and if the difference degree is not greater than the threshold value, determining the initial lane group as the lane group.
6. The method of claim 5, wherein calculating the difference between the high-precision lane lines and the visual lane lines in the initial lane line group comprises:
and converting the high-precision lane lines into a vehicle body coordinate system to obtain a lane point set in the vehicle body coordinate system, and calculating the difference between the high-precision lane lines and the paired visual lane lines according to the lane point set and the paired visual lane lines.
7. A positioning information correction apparatus comprising:
the acquisition module is used for acquiring a visual lane line detection result, in a vehicle body coordinate system, of a road where a self vehicle is located at a current frame, and a high-precision map lane line in a Universal Transverse Mercator (UTM) coordinate system, wherein the visual lane line detection result comprises at least one visual lane line, and the high-precision map lane line comprises at least one high-precision lane line;
the pairing module is used for pairing the visual lane line and the high-precision lane line to obtain at least one lane line group;
the first calculation module is used for converting the high-precision lane lines in each lane line group into a lane point set under a vehicle body coordinate system to obtain a lane point set under the vehicle body coordinate system, and calculating the deviation cost value of the high-precision lane lines and the paired visual lane lines according to the lane point set and the paired visual lane lines;
the second calculation module is used for calculating a correction error based on the deviation cost value of each high-precision lane line and the paired visual lane lines;
the correcting module is used for correcting the positioning information according to the correcting error;
the method comprises the following steps that positioning information comprises original DR positioning information under a DR coordinate system and original UTM positioning information under a UTM coordinate system, high-precision lane lines in the set of lane line groups are converted into a lane point set under a vehicle body coordinate system, and the method specifically comprises the following steps:
acquiring original DR positioning information which is time-synchronous with a current frame visual lane line detection result and a previous frame visual lane line detection result;
calculating to obtain virtual UTM positioning information of the current frame according to UTM positioning information of the previous frame, time synchronization original DR positioning information of the current frame, a conversion matrix of a DR coordinate system and a vehicle body coordinate system and a conversion matrix of the UTM coordinate system and the vehicle body coordinate system;
and converting the high-precision lane line into a lane point set under the vehicle body coordinate system based on the current frame virtual UTM positioning information to obtain the lane point set under the vehicle body coordinate system.
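The pose-propagation step of claim 7 can be sketched with planar SE(2) transforms, on the assumption (not stated in the patent, which works with separate DR/UTM/body transformation matrices) that each positioning result is reduced to a 2D pose matrix: the previous-frame UTM pose is advanced by the relative motion measured between the two DR poses, and the map lane points are then pulled into the body frame. The helpers `se2`, `virtual_utm_pose`, and `map_points_to_body` are hypothetical names.

```python
import numpy as np

def se2(x, y, yaw):
    """Homogeneous 2D pose matrix for position (x, y) and heading yaw."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0,  0, 1]])

def virtual_utm_pose(T_utm_prev, T_dr_prev, T_dr_curr):
    """Propagate the previous-frame UTM pose with the inter-frame relative
    motion observed in the DR coordinate system (second step of claim 7)."""
    delta = np.linalg.inv(T_dr_prev) @ T_dr_curr   # body motion between the two frames
    return T_utm_prev @ delta                      # virtual UTM pose of the current frame

def map_points_to_body(T_utm_virtual, pts_utm):
    """Transform high-precision map lane points (UTM) into the body frame
    using the virtual UTM pose (third step of claim 7)."""
    pts_h = np.c_[pts_utm, np.ones(len(pts_utm))]  # homogeneous coordinates
    return (np.linalg.inv(T_utm_virtual) @ pts_h.T).T[:, :2]
```

With a previous pose at (100, 200) heading east and a 1 m forward DR displacement, the virtual pose lands at (101, 200), and a map point 5 m ahead appears at x = 5 in the body frame.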
8. An electronic device, comprising: at least one processor, and a memory communicatively coupled to the at least one processor, wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the steps of the method of any of claims 1 to 6.
9. A storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the method of any one of claims 1 to 6.
10. A mobile tool equipped with a vision sensor, the mobile tool comprising the electronic device according to claim 8, wherein the vision sensor is communicatively connected to the electronic device.
CN202211080002.7A 2022-09-05 2022-09-05 Positioning information correction method, electronic device, and storage medium Active CN115143996B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211080002.7A CN115143996B (en) 2022-09-05 2022-09-05 Positioning information correction method, electronic device, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211080002.7A CN115143996B (en) 2022-09-05 2022-09-05 Positioning information correction method, electronic device, and storage medium

Publications (2)

Publication Number Publication Date
CN115143996A CN115143996A (en) 2022-10-04
CN115143996B true CN115143996B (en) 2023-01-17

Family

ID=83416517

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211080002.7A Active CN115143996B (en) 2022-09-05 2022-09-05 Positioning information correction method, electronic device, and storage medium

Country Status (1)

Country Link
CN (1) CN115143996B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116793369B (en) * 2023-02-10 2024-03-08 北京斯年智驾科技有限公司 Path planning method, device, equipment and computer readable storage medium
CN117490727B (en) * 2023-12-27 2024-03-29 合众新能源汽车股份有限公司 Positioning accuracy evaluation method and device and electronic equipment
CN117490728B (en) * 2023-12-28 2024-04-02 合众新能源汽车股份有限公司 Lane line positioning fault diagnosis method and system

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4043552B2 (en) * 1996-07-29 2008-02-06 ブロードコム・コーポレーシヨン Sampled amplitude read channel
WO2016187759A1 (en) * 2015-05-23 2016-12-01 SZ DJI Technology Co., Ltd. Sensor fusion using inertial and image sensors
CN109297500B (en) * 2018-09-03 2020-12-15 武汉中海庭数据技术有限公司 High-precision positioning device and method based on lane line feature matching
KR20200090527A (en) * 2019-01-21 2020-07-29 현대자동차주식회사 Apparatus for recognizing lane and method thereof
CN111854727B (en) * 2019-04-27 2022-05-13 北京魔门塔科技有限公司 Vehicle pose correction method and device
CN111742326A (en) * 2019-05-22 2020-10-02 深圳市大疆创新科技有限公司 Lane line detection method, electronic device, and storage medium
CN110954113B (en) * 2019-05-30 2021-10-15 北京初速度科技有限公司 Vehicle pose correction method and device
CN112307810B (en) * 2019-07-26 2023-08-04 北京魔门塔科技有限公司 Visual positioning effect self-checking method and vehicle-mounted terminal
CN111242031B (en) * 2020-01-13 2023-08-01 禾多科技(北京)有限公司 Lane line detection method based on high-precision map
KR102296520B1 (en) * 2020-12-18 2021-09-01 주식회사 카비 Method of detecting curved lane through path estimation by monocular vision based camera
CN113701781B (en) * 2021-09-30 2023-07-18 重庆长安汽车股份有限公司 Matching lane searching method based on high-precision map and visual lane lines
CN114396957B (en) * 2022-02-28 2023-10-13 重庆长安汽车股份有限公司 Positioning pose calibration method based on vision and map lane line matching and automobile
CN114670871A (en) * 2022-04-14 2022-06-28 北京京东乾石科技有限公司 Predictive control method and device for automatic driving vehicle

Also Published As

Publication number Publication date
CN115143996A (en) 2022-10-04

Similar Documents

Publication Publication Date Title
CN115143996B (en) Positioning information correction method, electronic device, and storage medium
CN113254569B (en) Positioning deviation rectifying method and device
CN113155139B (en) Vehicle track deviation rectifying method and device and electronic equipment
CN110658542B (en) Method, device, equipment and storage medium for positioning and identifying automatic driving automobile
CN115235500B (en) Lane line constraint-based pose correction method and device and all-condition static environment modeling method and device
CN114926809A (en) Passable area detection method and device, moving tool and storage medium
WO2019238146A2 (en) Onboard terminal-based taxi metered pricing method and system
CN111079533A (en) Unmanned vehicle driving decision method, unmanned vehicle driving decision device and unmanned vehicle
CN111476062A (en) Lane line detection method and device, electronic equipment and driving system
CN111966111B (en) Automatic power distribution based mobile charging equipment formation control method, system and device
CN115291262B (en) Satellite positioning data correction method and device, electronic equipment and storage medium
US20210118251A1 (en) Checkpoint-based tracing for monitoring a robotic system
CN112988931B (en) Method, device, equipment and storage medium for aligning driving track
CN104112302B (en) Vehicle insurance determination method and system based on vehicle driving state
CN113506012A (en) Driving behavior risk index judgment method based on mobile phone Internet of vehicles data
CN109903545A (en) A kind of method and system of car networking data transmission
CN112629553B (en) Vehicle co-location method, system and device under intelligent network connection environment
CN115303291B (en) Trailer trajectory prediction method and device for towed vehicle, electronic device and storage medium
CN117804421A (en) Grid map updating method and related products
CN115371719B (en) Parameter calibration method and device for detection equipment, storage medium and electronic device
CN117908035A (en) Single-line laser positioning method for dynamic scene, mobile device and storage medium
CN115320623B (en) Vehicle trajectory prediction method, apparatus, mobile device, and storage medium
CN115077512A (en) Map track verification method, map track verification equipment, mobile device and storage medium
CN116295391A (en) Picture construction positioning method and related products
CN115269763A (en) Local point cloud map updating and maintaining method and device, mobile tool and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant