CN118010038A - Vehicle position correction method, device, equipment and storage medium

Info

Publication number: CN118010038A
Application number: CN202211397255.7A
Authority: CN (China)
Prior art keywords: vehicle, moment, lane line, time, estimated position
Legal status: Pending
Other languages: Chinese (zh)
Inventor: 杨占铎
Current Assignee: Tencent Technology Shenzhen Co Ltd
Original Assignee: Tencent Technology Shenzhen Co Ltd
Application filed by Tencent Technology Shenzhen Co Ltd

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30 Map- or contour-matching
    • G01C25/00 Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass


Abstract

The application relates to the field of cloud technology, and provides a vehicle position correction method, device, equipment and storage medium. The method comprises the following steps: acquiring a first lane line within a set range around a first estimated position of the vehicle at time t and a second lane line identified by the photographing device, and obtaining a second estimated position of the vehicle based on a matching result of the first lane line and the second lane line; obtaining a loss function representing a loss of the corrected vehicle position based on the second estimated position and a vehicle position line, where the vehicle position line is obtained based on the second estimated position and the first lane line and its positional relationship with the first lane line satisfies a parallel condition; and correcting the second estimated position based on the first estimated position and the loss function to obtain a second target position of the vehicle at time t. The original camera image is not required during correction, so the application range is wide and the universality is high. The loss function is determined based on the second estimated position and the vehicle position line, thereby reducing the correction error.

Description

Vehicle position correction method, device, equipment and storage medium
Technical Field
The application relates to the field of cloud technology, and provides a vehicle position correction method, device, equipment and storage medium.
Background
High-precision maps are widely used in fields such as autonomous driving and the Internet of Vehicles, and they have promoted the rapid development of technologies in these fields, which in turn has driven further upgrades of high-precision map functions. Because road environments are complex and varied and traffic conditions change from moment to moment, how to correct the vehicle position in the high-precision map in time, so as to ensure safe driving and improve map navigation accuracy, has become a pressing problem to be solved.
At present, the following two correction approaches are commonly adopted:
One is to correct the vehicle position in the high-precision map based on the lane line intercept between the first lane line of the high-precision map and the second lane line of the camera image. However, during correction, the vehicle position along the lane line direction is subject to a strong constraint, which can cause the obtained vehicle position to lead or lag the actual vehicle position.
The other is to project the first lane line of the high-precision map into the camera image, obtain a re-projection error between the first lane line and the second lane line in the original camera image, and correct the vehicle position in the high-precision map based on the re-projection error. However, this correction method is only applicable to scenes where the original camera image can be acquired, so it has certain limitations and low universality.
Disclosure of Invention
The embodiments of the application provide a vehicle position correction method, device, equipment and storage medium, which are used to solve the problems of large correction error and of correcting the vehicle position in scenes where the original camera image cannot be acquired.
In a first aspect, an embodiment of the present application provides a method for correcting a vehicle position, including:
Determining a first estimated position of a vehicle at a t moment based on a first target position of the vehicle at the t-1 moment and combining vehicle movement data of the vehicle from the t-1 moment to the t moment, wherein the t moment represents the current moment;
Acquiring a first lane line based on map data in a set range around the first estimated position, identifying a second lane line based on lane line information acquired by a photographing device, and acquiring a second estimated position of the vehicle based on a matching result of the first lane line and the second lane line;
Obtaining a loss function representing a loss of corrected vehicle position based on the second estimated position and a vehicle position line, wherein the vehicle position line is obtained based on the second estimated position and the first lane line, and the position relationship with the first lane line satisfies a parallel condition;
and correcting the second estimated position based on the first estimated position and the loss function to obtain a second target position of the vehicle at the time t.
In a second aspect, an embodiment of the present application further provides an apparatus for correcting a vehicle position, including:
The position estimation unit is used for determining a first estimated position of the vehicle at the t moment according to a first target position of the vehicle at the t-1 moment and combining vehicle movement data of the vehicle from the t-1 moment to the t moment, wherein the t moment represents the current moment;
Acquiring a first lane line based on map data in a set range around the first estimated position, identifying a second lane line based on lane line information acquired by a photographing device, and acquiring a second estimated position of the vehicle based on a matching result of the first lane line and the second lane line;
a position correction unit configured to obtain a loss function representing a loss of corrected vehicle position based on the second estimated position and a vehicle position line, where the vehicle position line is obtained based on the second estimated position and the first lane line, and a positional relationship with the first lane line satisfies a parallel condition;
and correcting the second estimated position based on the first estimated position and the loss function to obtain a second target position of the vehicle at the time t.
In a third aspect, an embodiment of the present application further provides a computer device, including a processor and a memory, where the memory stores program code that, when executed by the processor, causes the processor to perform the steps of any one of the vehicle position correction methods described above.
In a fourth aspect, embodiments of the present application also provide a computer readable storage medium comprising program code for causing a computer device to perform the steps of any one of the vehicle position correction methods described above, when the program product is run on the computer device.
In a fifth aspect, embodiments of the present application also provide a computer program product comprising computer instructions for executing the steps of any one of the vehicle position correction methods described above by a processor.
The application has the following beneficial effects:
The embodiment of the application provides a vehicle position correction method, device, equipment and storage medium, wherein the method comprises the following steps: based on a first target position of the vehicle at a time t-1, determining a first estimated position of the vehicle at the time t by combining vehicle movement data of the vehicle from the time t-1 to the time t, wherein the time t represents the current time; based on map data in a set range around the first estimated position, a first lane line is obtained, based on lane line information obtained by the photographing device, a second lane line is identified, and based on a matching result of the first lane line and the second lane line, a second estimated position of the vehicle is obtained. Obtaining a loss function representing a loss of corrected vehicle position based on the second estimated position and a vehicle position line, wherein the vehicle position line is obtained based on the second estimated position and the first lane line, and the position relationship between the vehicle position line and the first lane line meets a parallel condition; and finally, correcting the second estimated position based on the first estimated position and the loss function to obtain a second target position of the vehicle at the time t.
Based on the map data within the set range around the first estimated position, the first lane line generated from the map is obtained, and based on the lane line information output by the photographing device, the second lane line recognized by the photographing device is displayed on the map. Compared with schemes that project the first lane line into the camera image, the embodiment of the application projects the second lane line onto the high-precision map, so position correction can be performed even in scenes where the original camera image cannot be obtained; the application range is wide, the restrictions are few, and the universality is high.
In addition, the embodiment of the application determines the perpendicular distance to the vehicle position line, based on the second estimated position and the vehicle position line, as the loss function, thereby overcoming the constraint on the vehicle position along the lane line direction during correction, reducing the correction error and further improving map navigation accuracy.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the application. The objectives and other advantages of the application will be realized and attained by the structure particularly pointed out in the written description and claims thereof as well as the appended drawings.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute a limitation on the application. In the drawings:
FIG. 1A is a schematic illustration of a rough position P1 of a vehicle on a high-precision map;
FIG. 1B is a schematic view of a first lane line and a second lane line around a vehicle;
FIG. 1C is a schematic illustration of a candidate position P2 of a vehicle on a high-precision map;
FIG. 1D is a diagram illustrating the error ranges of P1 and P2;
FIG. 1E is a schematic view of a target position P3 of a vehicle on a high-precision map;
FIG. 2 is a schematic diagram of an overlaid image;
FIG. 3 is an alternative schematic diagram of an application scenario in an embodiment of the present application;
FIG. 4A is a schematic flow chart of correcting a vehicle position according to an embodiment of the present application;
FIG. 4B is a flowchart illustrating a method for determining a first estimated position according to an embodiment of the present application;
FIG. 4C is a schematic diagram of a uniform acceleration motion according to an embodiment of the present application;
FIG. 4D is a schematic diagram of a first lane line and a second lane line matching logic according to an embodiment of the present application;
FIG. 4E is a schematic diagram of a logic diagram for obtaining a vehicle position line according to an embodiment of the present application;
FIG. 4F is a schematic view of an included angle provided by an embodiment of the present application;
FIG. 5 is a schematic flow chart of position correction of a vehicle exiting a tunnel according to an embodiment of the present application;
FIG. 6 is a schematic structural diagram of a vehicle position correction device according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a computer device according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a computing device according to an embodiment of the application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more apparent, the technical solutions of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the technical solutions of the present application, but not all embodiments. All other embodiments, based on the embodiments described in the present document, which can be obtained by a person skilled in the art without any creative effort, are within the scope of protection of the technical solutions of the present application.
Some terms in the embodiments of the present application are explained below to facilitate understanding by those skilled in the art.
1. Cloud technology (Cloud technology):
Cloud technology refers to a hosting technology that unifies a series of resources such as hardware, software and network within a wide area network or local area network to realize the computation, storage, processing and sharing of data.
Specifically, cloud technology is a general term for the network technology, information technology, integration technology, management platform technology, application technology and the like that apply the cloud computing business model. Resources in the cloud computing resource pool can be used on demand, which makes the system flexible and convenient.
Cloud computing technology has become an important support of cloud technology. It mainly addresses the large amount of computing and storage resources required by the background services of cloud-technology network systems, including but not limited to video websites, picture websites and other portal websites. With the rapid development of the internet industry, each item may have its own identification mark in the future and need to be transmitted to a background system for logical processing; data of different levels will be processed separately, and all kinds of industry data need a powerful system backing, which can only be provided by cloud computing.
2. Cloud computing:
Narrow cloud computing refers to the delivery and usage model of an information technology (Information Technology, IT) infrastructure, which is to obtain the required resources in an on-demand, easily scalable manner over a network; the generalized cloud computing refers to a service delivery and use mode, and a required service is obtained in an on-demand and easily-expandable manner through a network, and the service can be related to IT, software and the Internet, or can be other services.
Cloud computing is a product of the fusion of traditional computer and network technologies such as grid computing (Grid Computing), distributed computing (Distributed Computing), parallel computing (Parallel Computing), utility computing (Utility Computing), network storage (Network Storage Technologies), virtualization (Virtualization) and load balancing (Load Balance). Driven by the development of the internet, real-time data streams, the diversification of connected devices, and the demand for search services, social networks, mobile commerce, open collaboration and the like, cloud computing has developed rapidly. Unlike earlier parallel and distributed computing, the emergence of cloud computing is intended, in concept, to drive a revolutionary transformation of the whole internet model and enterprise management model.
3. High-precision map: also called a high-definition map (High Definition Map, HD Map), is a map dedicated to serving unmanned driving. Unlike conventional navigation maps, a high-precision map can provide Lane-level navigation information in addition to Road-level navigation information, and is far superior to conventional navigation maps in both the richness and the accuracy of the information.
4. Lane line: refers to a lane marking that guides the driving direction. Lane lines can be used at intersection approaches to indicate that vehicles should travel in the indicated direction, and to delimit the driving direction and each driving lane so as to relieve traffic pressure.
5. And (3) point cloud: is a massive point set expressing the target spatial distribution and the target surface characteristics under the same spatial reference system. The point cloud contains rich attribute information such as: three-dimensional coordinates (X, Y, Z), color, classification values, intensity values, time, etc.
The following briefly describes the design concept of the embodiment of the present application:
High-precision maps are widely used in fields such as autonomous driving and the Internet of Vehicles, and they have promoted the rapid development of technologies in these fields, which in turn has driven further upgrades of high-precision map functions. Because road environments are complex and varied and traffic conditions change from moment to moment, how to correct the vehicle position in the high-precision map in time, so as to ensure safe driving and improve map navigation accuracy, has become a pressing problem to be solved.
At present, the following two correction approaches are commonly adopted:
one is to correct the vehicle position in the high-precision map based on the lane line intercept between the first lane line of the high-precision map and the second lane line of the camera image. The specific process is as follows:
First, based on the time stamp carried in the lane line information output by the camera, the rough position P1 of the vehicle at time t is determined in the high-precision map shown in fig. 1A in combination with global positioning system (Global Positioning System, GPS) data, speedometer data, and inertial measurement unit (Inertial measurement unit, IMU) data between time t-1 and time t. Next, as shown in fig. 1B, map data around P1 is acquired to obtain a first lane line of the high-precision map, and a second lane line is identified based on the lane line information output by the camera.
Then, the lane line intercept between the first lane line of the high-precision map and the second lane line of the camera image is obtained, and P1 is translated along the direction perpendicular to the lane line according to the lane line intercept, so that the second lane line perceived by the camera coincides with the first lane line of the high-precision map, giving the candidate position P2 shown in FIG. 1C.
The white area of fig. 1D is the error range of P1, and the gray area of fig. 1D is the error range of P2. As can be seen from fig. 1D, the error range of P1 is larger, while the error range of P2 is reduced. Therefore, P2 is taken as an observation coordinate point, an observation noise covariance matrix is generated by combining the lane line direction, and the P2 and the observation noise covariance matrix are taken as observation parameters of a fusion model, so that a target position P3 of the vehicle at the time t is obtained on a high-precision map as shown in fig. 1E.
However, during correction, the vehicle position along the lane line direction is subject to a strong constraint, which can cause the obtained vehicle position to lead or lag the actual vehicle position.
The other is to project the first lane line of the high-precision map into the camera image to obtain an overlapping image as shown in FIG. 2, obtain a re-projection error between the first lane line and the second lane line in the original camera image, and correct the vehicle position in the high-precision map based on the re-projection error. However, this correction method is only applicable to scenes where the original camera image can be acquired, so it has certain limitations and low universality.
In view of this, the embodiments of the present application provide a method, apparatus, device and storage medium for correcting a vehicle position. The method comprises the following steps: based on a first target position of the vehicle at a time t-1, determining a first estimated position of the vehicle at the time t by combining vehicle movement data of the vehicle from the time t-1 to the time t, wherein the time t represents the current time; based on map data in a set range around the first estimated position, a first lane line is obtained, based on lane line information obtained by the photographing device, a second lane line is identified, and based on a matching result of the first lane line and the second lane line, a second estimated position of the vehicle is obtained. Obtaining a loss function representing a loss of corrected vehicle position based on the second estimated position and a vehicle position line, wherein the vehicle position line is obtained based on the second estimated position and the first lane line, and the position relationship between the vehicle position line and the first lane line meets a parallel condition; and finally, correcting the second estimated position based on the first estimated position and the loss function to obtain a second target position of the vehicle at the time t.
Based on the map data within the set range around the first estimated position, the first lane line generated from the map is obtained, and based on the lane line information output by the photographing device, the second lane line recognized by the photographing device is displayed on the map. Compared with schemes that project the first lane line into the camera image, the embodiment of the application projects the second lane line onto the high-precision map, so position correction can be performed even in scenes where the original camera image cannot be obtained; the application range is wide, the restrictions are few, and the universality is high.
In addition, the embodiment of the application determines the perpendicular distance to the vehicle position line, based on the second estimated position and the vehicle position line, as the loss function, thereby overcoming the constraint on the vehicle position along the lane line direction during correction, reducing the correction error and further improving map navigation accuracy.
The preferred embodiments of the present application will be described below with reference to the accompanying drawings of the specification, it being understood that the preferred embodiments described herein are for illustration and explanation only, and not for limitation of the present application, and embodiments of the present application and features of the embodiments may be combined with each other without conflict.
User terminals include, but are not limited to, mobile phones, computers, intelligent voice interaction devices, smart home appliances, vehicle-mounted terminals, aircraft, and the like. The embodiments of the application can be applied to various scenarios, including but not limited to cloud technology, artificial intelligence, intelligent transportation, assisted driving and the like.
Fig. 3 shows one of the application scenarios, which includes two physical terminal devices 310 and a server 330. Each physical terminal device 310 establishes a wired or wireless network connection with the server 330.
The physical terminal device 310 is a computer device used by a user, including but not limited to a mobile phone, a computer, an intelligent voice interaction device, an intelligent home appliance, a vehicle-mounted terminal, an aircraft, etc.
The server 330 may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server for providing cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, a content distribution network (Content Delivery Network, CDN), basic cloud computing services such as big data and an artificial intelligence platform, which is not limited herein.
The physical terminal device 310 opens a map navigation client 320 installed locally to the device in response to a user-triggered application launch operation. And then, in response to the navigation operation triggered by the user, the map and the navigation route around the vehicle are displayed on the map navigation client 320.
In order to ensure safe driving of the vehicle and improve the map navigation accuracy, a vehicle position correction system is deployed on the map navigation client 320 to correct the position of the vehicle on the high-accuracy map in real time. The correction process of the vehicle position correction system is as follows:
First, based on a first target position of a vehicle at a time t-1, a first estimated position of the vehicle at the time t is determined by combining vehicle movement data of the vehicle from the time t-1 to the time t, wherein the time t represents the current time.
Then, the server 330 is accessed through the map navigation client 320, map data in a set range around the first estimated position is downloaded, a first lane line is obtained based on the corresponding map data, a second lane line is identified based on lane line information obtained by the photographing apparatus, and a second estimated position of the vehicle is obtained based on a matching result of the first lane line and the second lane line.
And obtaining a loss function representing the loss of the corrected vehicle position based on the second estimated position and the vehicle position line, and correcting the second estimated position based on the first estimated position and the loss function to obtain a second target position of the vehicle at the time t. The vehicle position line is obtained based on the second estimated position and the first lane line, and the position relationship between the vehicle position line and the first lane line meets the parallel condition.
Finally, the vehicle position correction system feeds back the corrected second target position to the map navigation client 320, and displays the position in the map of the client.
Next, referring to a flowchart shown in fig. 4A, a method for correcting a vehicle position according to an embodiment of the present application is described.
S401: the vehicle position correction system determines a first estimated position of the vehicle at the time t based on a first target position of the vehicle at the time t-1 and combining vehicle movement data of the vehicle from the time t-1 to the time t, wherein the time t represents the current time.
A plurality of sensors, such as GPS, Real-Time Kinematic (RTK) positioning, a speedometer and an IMU, are deployed in the vehicle. The vehicle position correction system obtains the first target position of the vehicle at time t-1 by reading the positioning data acquired by the GPS or RTK, and obtains the real-time speed of the vehicle from time t-1 to time t from the vehicle speed data acquired by the speedometer or odometer.
The IMU includes an accelerometer for obtaining acceleration of the vehicle from time t-1 to time t and a gyroscope for obtaining angular velocity of the vehicle from time t-1 to time t. The real-time speed, acceleration and angular speed of the vehicle from the time t-1 to the time t are also called vehicle movement data.
The unit of time may be seconds(s), minutes (min), or hours (h), but in order to ensure the real-time performance of the position correction, the unit of time is generally seconds.
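For illustration only, the following minimal Python sketch shows one way the motion data read from these sensors between time t-1 and time t might be grouped; the class and field names are assumptions used by the later sketches, not terms from the application.

```python
from dataclasses import dataclass

@dataclass
class VehicleMotionData:
    """Illustrative container for sensor data read between time t-1 and time t."""
    speed: float          # real-time speed from the speedometer/odometer, in m/s
    forward_accel: float  # accelerometer reading along the driving direction, in m/s^2
    yaw_rate: float       # gyroscope angular velocity about the vertical axis, in rad/s
    dt: float             # elapsed time from t-1 to t, in seconds
```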
The vehicle position correction system may determine the first estimated position of the vehicle at time t by any one of:
as shown in fig. 4B, the procedure of the first mode is as follows:
s4011: the vehicle position correction system determines the travel distance and the posture change of the vehicle from the time t-1 to the time t based on the vehicle movement data of the vehicle from the time t-1 to the time t.
Limited by its structural design, the accelerometer continuously accumulates errors from each historical period when measuring the acceleration of the vehicle from time t-1 to time t, which reduces the measurement accuracy of the acceleration. Therefore, before the travel distance of the vehicle from time t-1 to time t is determined based on the acceleration measured by the accelerometer, the acceleration of the vehicle from time t-1 to time t is corrected using the real-time speed of the vehicle from time t-1 to time t, where the real-time speed is obtained from the vehicle speed data collected by the speedometer or odometer.
And determining the driving distance of the vehicle from the time t-1 to the time t based on the corrected acceleration and the driving time from the time t-1 to the time t. Finally, the vehicle position correction system determines the posture change of the vehicle from the time t-1 to the time t through the angular speed of the vehicle from the time t-1 to the time t.
The process of determining the driving distance of the vehicle from the time t-1 to the time t is as follows:
When the vehicle is in uniform acceleration motion, the influence of acceleration on speed over time can be represented by the curve shown in FIG. 4C, in which the slope of the diagonal line represents the acceleration. If the time axis of FIG. 4C is divided into small intervals, the displacement within each interval is approximately the area of a small rectangle (the speed multiplied by the small time interval, v × Δt). As the time intervals are subdivided toward infinity, the total displacement (distance) is found to equal the area of the triangle, i.e., the travel distance equals the area under the speed curve.
The process of subdividing time and summing the areas under the line is called integration. Therefore, the embodiment of the application determines the travel distance from time t-1 to time t by integrating the corrected acceleration over the travel time.
S4012: the vehicle position correction system determines a first estimated position of the vehicle at the time t based on a first target position of the vehicle at the time t-1 and combining the driving distance and the posture change.
The coordinates of the first target position of the vehicle at time t-1 are superposed with the travel distance to obtain the coordinates of the first estimated position of the vehicle at time t. Then, according to the attitude change, the attitude of the first target position of the vehicle at time t-1 is adjusted to determine the attitude of the first estimated position of the vehicle at time t.
For example, when the coordinates of the vehicle at time t-1 are (21, 54, 0), the travel distance in the x-direction is 7 m, the travel distance in the y-direction is 10 m, and the travel distance in the z-direction is 0 m, the coordinates of the vehicle at time t are (28, 64, 0). If the attitude of the vehicle at time t-1 is heading due north and the attitude change from time t-1 to time t is a 45-degree clockwise rotation, the attitude of the vehicle at time t is heading north-east.
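A minimal dead-reckoning sketch of mode one is given below, assuming planar motion, a single yaw angle for the attitude, and an acceleration that has already been corrected with the odometer speed; it is an illustration of this mode, not code from the application.

```python
import math

def dead_reckon(position_t1, heading_t1, speed, acceleration, yaw_rate, dt):
    """Propagate the t-1 target position to a first estimated position at time t."""
    # travel distance = area under the speed curve (uniform-acceleration assumption)
    distance = speed * dt + 0.5 * acceleration * dt ** 2

    # attitude change obtained from the integrated angular velocity
    heading_t = heading_t1 + yaw_rate * dt

    # superpose the displacement onto the t-1 coordinates
    # (simplification: the whole displacement is applied along the heading at time t)
    x, y = position_t1
    x_t = x + distance * math.cos(heading_t)
    y_t = y + distance * math.sin(heading_t)
    return (x_t, y_t), heading_t
```

Consistent with the worked example above, a displacement of 7 m in x and 10 m in y moves the coordinates (21, 54, 0) to (28, 64, 0).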
Mode two: and determining a first estimated position of the vehicle at the time t by using the optimization model.
The optimization model refers to a model constructed based on a Kalman filtering algorithm, graph optimization or other optimization algorithm and is used for predicting a first estimated position of the vehicle on the high-precision map and correcting a second target position of the vehicle on the high-precision map.
The optimization model predicts a candidate position of the vehicle at the time t based on inertial measurement data in the vehicle movement data, corrects the candidate position based on the first target position and other data in the vehicle movement data, and determines a first estimated position of the vehicle at the time t.
For easy understanding, taking an optimization model constructed based on a Kalman filtering algorithm as an example, a specific process of obtaining the first estimated position is described.
The Kalman filtering algorithm comprises the following steps: prediction equations and observation equations. And predicting the candidate position of the vehicle at the time t by taking the inertia measurement data in the vehicle movement data as parameters of a prediction equation (shown in a formula 1). And then, correcting the candidate position by taking the first target position and other data in the vehicle movement data as parameters of an observation equation (shown as a formula 2), and determining a first estimated position of the vehicle at the time t.
x_k = F_k x_{k-1} + B_k u_k + w_k    (Equation 1)
where x_k is the candidate position of the vehicle at time t, F_k and B_k are the motion models of the vehicle at time t, x_{k-1} is the first target position of the vehicle at time t-1, u_k is the inertial measurement data of the vehicle from time t-1 to time t, and w_k is the input noise of the vehicle at time t.
z_k = H_k x_k + v_k    (Equation 2)
where z_k is the first estimated position of the vehicle at time t, x_k is the candidate position of the vehicle at time t, H_k is the other data of the vehicle from time t-1 to time t, and v_k is the observation noise of the vehicle at time t, whose value is determined from the recognition error of the photographing device.
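For readers unfamiliar with the Kalman filter form behind Equations 1 and 2, the sketch below fills in the standard covariance bookkeeping; the matrices P, Q and R are not spelled out in the text and are assumptions of this illustration.

```python
import numpy as np

def kalman_predict(x_prev, P_prev, F, B, u, Q):
    """Prediction step corresponding to Equation 1: propagate the t-1 state with input u."""
    x_pred = F @ x_prev + B @ u
    P_pred = F @ P_prev @ F.T + Q
    return x_pred, P_pred

def kalman_update(x_pred, P_pred, z, H, R):
    """Observation step corresponding to Equation 2: correct the prediction with measurement z."""
    y = z - H @ x_pred                       # innovation
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x_pred)) - K @ H) @ P_pred
    return x_new, P_new
```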
S402: the vehicle position correction system obtains a first lane line based on map data in a set range around the first estimated position, identifies a second lane line based on lane line information obtained by the photographing device, and obtains a second estimated position of the vehicle based on a result of matching the first lane line and the second lane line.
The first lane line is matched with the second lane line in the following way:
The following operations are circularly executed for the first lane line and the second lane line until the lane line intercept between the first lane line and the second lane line is smaller than a set lane line threshold value, and the matching success is determined:
sampling the first lane line to obtain a first sampling point group, and sampling the second lane line to obtain a second sampling point group;
respectively determining sampling point distances between each first sampling point in the first sampling point group and adjacent second sampling points in the second sampling point group;
and determining the moving distance of the second lane line based on the obtained sampling point distances, and moving the second lane line according to the moving distance.
For ease of understanding, taking the lane lines shown in fig. 4D as an example, the process of performing one-time matching of two lane lines is as follows.
The first lane line is sampled to obtain a first point cloud (shown as black diamonds), and the second lane line is sampled to obtain a second point cloud (shown as gray diamonds). Adjacent points in the first point cloud and the second point cloud are registered using the iterative closest point (Iterative Closest Point, ICP) algorithm to obtain a plurality of point cloud distances. The moving distance of the second lane line is then determined based on the obtained point cloud distances, and the second lane line is moved according to the moving distance.
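A simplified nearest-neighbour (ICP-style) sketch of this matching loop is given below; the point arrays, threshold and iteration budget are illustrative assumptions, not values from the application.

```python
import numpy as np

def match_lane_lines(map_line, cam_line, intercept_threshold=0.1, max_iters=20):
    """Iteratively shift the camera lane line toward the map lane line.

    map_line, cam_line: (N, 2) arrays of sampled lane-line points in map coordinates.
    Returns the shifted camera points and the accumulated shift.
    """
    cam = np.asarray(cam_line, dtype=float).copy()
    map_pts = np.asarray(map_line, dtype=float)
    total_shift = np.zeros(2)
    for _ in range(max_iters):
        # nearest map sampling point for every camera sampling point
        dists = np.linalg.norm(map_pts[None, :, :] - cam[:, None, :], axis=2)
        nearest = dists.argmin(axis=1)

        # the mean offset of the matched pairs is the move distance for this round
        offset = (map_pts[nearest] - cam).mean(axis=0)
        cam += offset
        total_shift += offset

        # stop once the remaining lane line intercept falls below the threshold
        if np.linalg.norm(offset) < intercept_threshold:
            break
    return cam, total_shift
```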
S403: the vehicle position correction system obtains a loss function representing a loss of corrected vehicle position based on the second estimated position and a vehicle position line, wherein the vehicle position line is obtained based on the second estimated position and the first lane line, and the position relationship with the first lane line meets a parallel condition.
As shown in FIG. 4E, the second estimated position of the vehicle and the vehicle position line (shown as the black dotted line in FIG. 4E) are obtained by matching the first lane line and the second lane line. The vehicle position line is a straight line that passes through the point P2 and is parallel to the first lane line. To overcome the constraint on the vehicle position along the lane line direction during correction, the embodiment of the application corrects the second estimated position of the vehicle by taking the perpendicular distance to the vehicle position line as the loss function, so as to reduce the correction error range.
The unit vector along the vehicle position line is cross-multiplied with the vector P2P3 to obtain a new vector, whose z value represents the perpendicular distance to the vehicle position line. The specific solving process is as follows:
As shown in FIG. 4F, a coordinate system is established with the second estimated position as the origin, and the included angle between the vehicle position line and the horizontal axis of the coordinate system is obtained; based on the second estimated position and the included angle, the loss function representing the loss of the corrected vehicle position is obtained, as shown in Equation 3:
e = M (d × P2P3)    (Equation 3)
where e is the loss function, M = [0 0 1] is the template matrix for extracting the z value, d = (cos θ, sin θ, 0) is the unit vector along the vehicle position line with θ the included angle, the coordinates of P2 are (x2, y2, 0), and the coordinates of the second target position P3 are (x3, y3, 0).
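The loss can be evaluated with a few lines of Python, shown below as an illustration; the template matrix M = [0, 0, 1] follows the reconstruction of Equation 3 above and is an assumption of this sketch.

```python
import numpy as np

def position_loss(p2, p3, line_direction):
    """Signed perpendicular distance from the corrected position P3 to the vehicle position line.

    p2: second estimated position (x2, y2, 0), a point on the vehicle position line.
    p3: corrected position (x3, y3, 0).
    line_direction: unit vector along the vehicle position line (parallel to the first lane line).
    """
    d = np.asarray(line_direction, dtype=float)
    v = np.asarray(p3, dtype=float) - np.asarray(p2, dtype=float)
    M = np.array([0.0, 0.0, 1.0])        # template matrix that picks out the z value
    return M @ np.cross(d, v)
```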
S404: and correcting the second estimated position by the vehicle position correction system based on the first estimated position and the loss function to obtain a second target position of the vehicle at the time t.
And correcting the second estimated position by using the optimization model to obtain a second target position of the vehicle at the time t. From the foregoing description, it can be seen that the optimization model is a model constructed from a kalman filter algorithm, graph optimization, or other optimization algorithm. For ease of understanding, a specific procedure for obtaining the second target position will be described taking an optimization model constructed based on a kalman filter algorithm as an example.
And taking the first estimated position of the vehicle at the time t as a parameter of a prediction equation (shown as a formula 4) to predict the second estimated position of the vehicle at the time t. And then taking the first estimated position and the loss function as parameters of an observation equation (shown as a formula 5) to determine a second target position of the vehicle at the time t.
x_k' = F_k x_{k-1} + B_k z_k + w_k    (Equation 4)
where x_k' is the second estimated position of the vehicle at time t, F_k and B_k are the motion models of the vehicle at time t, x_{k-1} is the first target position of the vehicle at time t-1, z_k is the first estimated position of the vehicle at time t, and w_k is the input noise of the vehicle at time t.
z_k' = e x_k' + v_k    (Equation 5)
where z_k' is the second target position of the vehicle at time t, x_k' is the second estimated position of the vehicle at time t, e is the loss function, and v_k is the observation noise of the vehicle at time t, which is determined from the recognition error of the photographing device.
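As one possible reading of Equations 4 and 5, the perpendicular-distance loss can be treated as the innovation of a scalar update that only moves the estimate across the lane, leaving the along-lane component unconstrained; the sketch below illustrates this interpretation and is not a formula given in the application.

```python
import numpy as np

def apply_lane_correction(x_pred, p2, line_direction, obs_noise, prior_var):
    """Shift the predicted position toward the vehicle position line, across the lane only.

    x_pred: position predicted from the first estimated position, as (x, y).
    p2: second estimated position, a point on the vehicle position line, as (x, y).
    line_direction: direction of the vehicle position line.
    """
    d = np.asarray(line_direction, dtype=float)
    d = d / np.linalg.norm(d)
    normal = np.array([-d[1], d[0]])                 # unit normal to the vehicle position line
    residual = normal @ (np.asarray(p2, dtype=float) - np.asarray(x_pred, dtype=float))
    gain = prior_var / (prior_var + obs_noise)       # scalar Kalman-style gain
    return np.asarray(x_pred, dtype=float) + gain * residual * normal
```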
Mobile communication transmits information by radio waves under very complex operating conditions, and is easily affected by factors in the communication environment such as object occlusion and signal transmission attenuation, which produce multipath and shadowing effects and cause amplitude fading and delay spread of the propagating radio waves.
When a vehicle travels in a tunnel, serious object occlusion and signal transmission attenuation occur, making it difficult for the high-precision map to accurately locate the vehicle position. Therefore, after the vehicle exits the tunnel, the correction method provided by the embodiment of the application is used, as shown in FIG. 5, to quickly correct the vehicle position on the high-precision map.
S501: the vehicle position correction system obtains a first target position of the vehicle at the time t-1 and vehicle movement data of the vehicle from the time t-1 to the time t through various sensors in the vehicle;
S502: the vehicle position correction system determines a first estimated position of the vehicle at the time t based on a first target position of the vehicle at the time t-1 and combining vehicle movement data of the vehicle from the time t-1 to the time t;
S503: the vehicle position correction system obtains a first lane line based on map data in a set range around the first estimated position, and identifies a second lane line based on lane line information obtained by the photographing device;
S504: the vehicle position correction system samples the two lane lines separately to obtain two groups of sampling points, and registers the sampling points with adjacent positions in the two groups using the ICP algorithm to obtain a plurality of sampling point distances;
S505: the vehicle position correction system determines the moving distance of the second lane line based on the obtained point cloud distances, and moves the second lane line according to the moving distance;
S506: the vehicle position correction system obtains a second estimated position of the vehicle based on a matching result of the first lane line and the second lane line;
S507: the vehicle position correction system obtains a loss function representing a loss of corrected vehicle position based on the second estimated position and the vehicle position line;
S508: and correcting the second estimated position by the vehicle position correction system based on the first estimated position and the loss function to obtain a second target position of the vehicle at the time t.
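Tying the steps together, the following orchestration sketch mirrors S501 to S508; `sensors`, `hd_map` and `camera` are hypothetical interfaces, and the helper functions are the illustrative sketches given earlier in this description rather than code from the application.

```python
import numpy as np

def correct_vehicle_position(sensors, hd_map, camera, target_prev, heading_prev):
    """Illustrative end-to-end flow after the vehicle exits the tunnel (S501-S508)."""
    motion = sensors.read()                                               # S501
    first_est, heading = dead_reckon(target_prev, heading_prev,
                                     motion.speed, motion.forward_accel,
                                     motion.yaw_rate, motion.dt)          # S502

    map_line = hd_map.lane_line_points_near(first_est)                    # S503, (N, 2) array
    cam_line = camera.lane_line_points_in_map_frame(first_est, heading)   # S503, (N, 2) array

    _, total_shift = match_lane_lines(map_line, cam_line)                 # S504-S505
    second_est = np.asarray(first_est) + total_shift                      # S506 (sign convention assumed)

    direction = map_line[-1] - map_line[0]                                # along the vehicle position line
    return apply_lane_correction(np.asarray(first_est), second_est,
                                 direction, obs_noise=0.25, prior_var=1.0)  # S507-S508
```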
The embodiment of the application also provides a correction device for the vehicle position based on the same conception as the embodiment of the method. As shown in fig. 6, the vehicle position orthotic device 600 may include:
a position estimation unit 601, configured to determine a first estimated position of the vehicle at a time t based on a first target position of the vehicle at a time t-1, in combination with vehicle movement data of the vehicle from the time t-1 to the time t, where the time t represents a current time;
obtaining a first lane line based on map data in a set range around the first estimated position, identifying a second lane line based on lane line information obtained by the photographing device, and obtaining a second estimated position of the vehicle based on a matching result of the first lane line and the second lane line;
A position correction unit 602, configured to obtain a loss function representing a loss of corrected vehicle position based on the second estimated position and a vehicle position line, where the vehicle position line is obtained based on the second estimated position and the first lane line, and a positional relationship with the first lane line satisfies a parallel condition;
and correcting the second estimated position based on the first estimated position and the loss function to obtain a second target position of the vehicle at the time t.
Optionally, the position correction unit 602 is configured to:
Establishing a coordinate system by taking the second estimated position as an origin to obtain an included angle between a vehicle position line and a horizontal axis of the coordinate system;
Based on the second estimated position and the angle, a loss function is obtained that characterizes a loss of corrected vehicle position.
Optionally, the position estimating unit 601 determines the first estimated position of the vehicle at the time t by adopting any one of the following manners:
Determining the running distance and the posture change of the vehicle from the t-1 moment to the t moment based on the vehicle movement data of the vehicle from the t-1 moment to the t moment, and determining a first estimated position of the vehicle from the t moment based on a first target position of the vehicle from the t-1 moment and combining the running distance and the posture change;
And predicting a candidate position of the vehicle at the time t based on inertia measurement data in the vehicle movement data, correcting the candidate position based on the first target position and other data in the vehicle movement data, and determining a first estimated position of the vehicle at the time t.
Optionally, the location estimating unit 601 is configured to:
Correcting the acceleration of the vehicle from the t-1 moment to the t moment by the real-time speed of the vehicle from the t-1 moment to the t moment, and determining the driving distance of the vehicle from the t-1 moment to the t moment by combining the driving time from the t-1 moment to the t moment based on the corrected acceleration;
And determining the attitude change of the vehicle from the time t-1 to the time t through the angular speed of the vehicle from the time t-1 to the time t.
Optionally, the location estimating unit 601 is configured to:
superposing the coordinates of the first target position of the vehicle at the time t-1 with the driving distance to obtain the coordinates of the first estimated position of the vehicle at the time t;
and according to the posture change, regulating the posture of the first target position of the vehicle at the time t-1, and determining the posture of the first estimated position of the vehicle at the time t.
Optionally, the position estimating unit 601 matches the first lane line with the second lane line in the following manner:
The following operations are circularly executed for the first lane line and the second lane line until the lane line intercept between the first lane line and the second lane line is smaller than a set lane line threshold value, and the matching success is determined:
sampling the first lane line to obtain a first sampling point group, and sampling the second lane line to obtain a second sampling point group;
respectively determining sampling point distances between each first sampling point in the first sampling point group and adjacent second sampling points in the second sampling point group;
and determining the moving distance of the second lane line based on the obtained sampling point distances, and moving the second lane line according to the moving distance.
For convenience of description, the above parts are described as being functionally divided into modules (or units) respectively. Of course, the functions of each module (or unit) may be implemented in the same piece or pieces of software or hardware when implementing the present application.
Having described the method and apparatus for correcting the position of a vehicle according to an exemplary embodiment of the present application, next, a computer device according to another exemplary embodiment of the present application is described.
Those skilled in the art will appreciate that the various aspects of the application may be implemented as a system, method, or program product. Accordingly, aspects of the application may be embodied in the following forms, namely: an entirely hardware embodiment, an entirely software embodiment (including firmware, micro-code, etc.), or an embodiment combining hardware and software aspects, which may be referred to herein as a "circuit," "module" or "system."
Based on the same inventive concept as the above-mentioned method embodiment, a computer device is further provided in the embodiment of the present application, and referring to fig. 7, a computer device 700 may include at least a processor 701 and a memory 702. The memory 702 stores program code that, when executed by the processor 701, causes the processor 701 to perform the steps of any one of the vehicle position correction methods described above.
In some possible implementations, a computing device according to the application may include at least one processor, and at least one memory. Wherein the memory stores program code that, when executed by the processor, causes the processor to perform the steps in the vehicle position correction method according to various exemplary embodiments of the application described hereinabove. For example, the processor may perform the steps as shown in fig. 4A.
A computing device 800 according to such an embodiment of the application is described below with reference to fig. 8. The computing device 800 of fig. 8 is only one example and should not be taken as limiting the functionality and scope of use of embodiments of the present application.
As shown in fig. 8, computing device 800 is in the form of a general purpose computing device. Components of computing device 800 may include, but are not limited to: the at least one processing unit 801, the at least one memory unit 802, and a bus 803 connecting the different system components (including the memory unit 802 and the processing unit 801).
Bus 803 represents one or more of several types of bus structures, including a memory bus or memory controller, a peripheral bus, a processor, and a local bus using any of a variety of bus architectures.
The storage unit 802 may include readable media in the form of volatile memory, such as Random Access Memory (RAM) 8021 and/or cache storage unit 8022, and may further include Read Only Memory (ROM) 8023.
The storage unit 802 may also include a program/utility 8025 having a set (at least one) of program modules 8024, such program modules 8024 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment.
The computing device 800 may also communicate with one or more external devices 804 (e.g., keyboard, pointing device, etc.), one or more devices that enable a user to interact with the computing device 800, and/or any devices (e.g., routers, modems, etc.) that enable the computing device 800 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 805. Moreover, computing device 800 may also communicate with one or more networks such as a Local Area Network (LAN), a Wide Area Network (WAN) and/or a public network, such as the Internet, through network adapter 806. As shown, network adapter 806 communicates with other modules for computing device 800 over bus 803. It should be appreciated that although not shown, other hardware and/or software modules may be used in connection with computing device 800, including but not limited to: microcode, device drivers, redundant processors, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
Based on the same inventive concept as the above-described method embodiments, aspects of the vehicle position correction method provided by the present application may also be implemented in the form of a program product comprising program code for causing a computer device to perform the steps of the vehicle position correction method according to the various exemplary embodiments of the present application described in the present specification when the program product is run on the computer device, for example, the computer device may perform the steps as shown in fig. 4A.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium would include the following: an electrical connection having one or more wires, a portable disk, a hard disk, random Access Memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
While preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiments and all such alterations and modifications as fall within the scope of the application.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present application without departing from the spirit or scope of the application. Thus, it is intended that the present application also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.

Claims (15)

1. A method of correcting a vehicle position, comprising:
determining a first estimated position of a vehicle at a time t based on a first target position of the vehicle at a time t-1 in combination with vehicle movement data of the vehicle from the time t-1 to the time t, wherein the time t represents a current time;
acquiring a first lane line based on map data within a set range around the first estimated position, identifying a second lane line based on lane line information acquired by a photographing device, and obtaining a second estimated position of the vehicle based on a matching result of the first lane line and the second lane line;
obtaining a loss function representing a loss of the corrected vehicle position based on the second estimated position and a vehicle position line, wherein the vehicle position line is obtained based on the second estimated position and the first lane line, and a positional relationship between the vehicle position line and the first lane line satisfies a parallel condition;
and correcting the second estimated position based on the first estimated position and the loss function to obtain a second target position of the vehicle at the time t.
2. The method of claim 1, wherein the obtaining a loss function representing a loss of the corrected vehicle position based on the second estimated position and the vehicle position line comprises:
establishing a coordinate system taking the second estimated position as an origin, and obtaining an included angle between the vehicle position line and a horizontal axis of the coordinate system;
and obtaining the loss function representing the loss of the corrected vehicle position based on the second estimated position and the included angle.
3. The method of claim 1, wherein the determining the first estimated position of the vehicle at the time t based on the first target position of the vehicle at the time t-1 in combination with vehicle movement data of the vehicle from the time t-1 to the time t comprises any one of:
determining a driving distance and a posture change of the vehicle from the time t-1 to the time t based on vehicle movement data of the vehicle from the time t-1 to the time t, and determining the first estimated position of the vehicle at the time t based on the first target position of the vehicle at the time t-1 in combination with the driving distance and the posture change;
and predicting a candidate position of the vehicle at the time t based on inertial measurement data in the vehicle movement data, correcting the candidate position based on the first target position and other data in the vehicle movement data, and determining the first estimated position of the vehicle at the time t.
4. The method of claim 3, wherein the determining a driving distance and a posture change of the vehicle from the time t-1 to the time t based on vehicle movement data of the vehicle from the time t-1 to the time t comprises:
correcting an acceleration of the vehicle from the time t-1 to the time t using a real-time speed of the vehicle from the time t-1 to the time t, and determining the driving distance of the vehicle from the time t-1 to the time t based on the corrected acceleration in combination with a driving time from the time t-1 to the time t;
and determining the posture change of the vehicle from the time t-1 to the time t based on an angular speed of the vehicle from the time t-1 to the time t.
5. The method of claim 3, wherein the determining a first estimated position of the vehicle at the time t based on the first target position of the vehicle at the time t-1 in combination with the driving distance and the posture change comprises:
superimposing the driving distance on the coordinates of the first target position of the vehicle at the time t-1 to obtain the coordinates of the first estimated position of the vehicle at the time t;
and adjusting a posture of the vehicle at the first target position at the time t-1 according to the posture change to determine a posture of the vehicle at the first estimated position at the time t.
6. The method of claim 1, wherein the first lane line and the second lane line are matched in the following manner:
cyclically performing the following operations on the first lane line and the second lane line until a lane line intercept between the first lane line and the second lane line is smaller than a set lane line threshold, at which point the matching is determined to be successful:
sampling the first lane line to obtain a first sampling point group, and sampling the second lane line to obtain a second sampling point group;
determining, for each first sampling point in the first sampling point group, a sampling point distance to an adjacent second sampling point in the second sampling point group;
and determining a moving distance of the second lane line based on the obtained sampling point distances, and moving the second lane line according to the moving distance.
7. An apparatus for correcting a vehicle position, comprising:
a position estimation unit, configured to: determine a first estimated position of a vehicle at a time t based on a first target position of the vehicle at a time t-1 in combination with vehicle movement data of the vehicle from the time t-1 to the time t, wherein the time t represents a current time;
acquire a first lane line based on map data within a set range around the first estimated position, identify a second lane line based on lane line information acquired by a photographing device, and obtain a second estimated position of the vehicle based on a matching result of the first lane line and the second lane line; and
a position correction unit, configured to: obtain a loss function representing a loss of the corrected vehicle position based on the second estimated position and a vehicle position line, wherein the vehicle position line is obtained based on the second estimated position and the first lane line, and a positional relationship between the vehicle position line and the first lane line satisfies a parallel condition;
and correct the second estimated position based on the first estimated position and the loss function to obtain a second target position of the vehicle at the time t.
8. The apparatus of claim 7, wherein the position correction unit is configured to:
establish a coordinate system taking the second estimated position as an origin, and obtain an included angle between the vehicle position line and a horizontal axis of the coordinate system;
and obtain the loss function representing the loss of the corrected vehicle position based on the second estimated position and the included angle.
9. The apparatus of claim 7, wherein the position estimation unit determines the first estimated position of the vehicle at the time t by any one of:
determining a driving distance and a posture change of the vehicle from the time t-1 to the time t based on vehicle movement data of the vehicle from the time t-1 to the time t, and determining the first estimated position of the vehicle at the time t based on the first target position of the vehicle at the time t-1 in combination with the driving distance and the posture change;
and predicting a candidate position of the vehicle at the time t based on inertial measurement data in the vehicle movement data, correcting the candidate position based on the first target position and other data in the vehicle movement data, and determining the first estimated position of the vehicle at the time t.
10. The apparatus of claim 9, wherein the position estimation unit is configured to:
correcting an acceleration of the vehicle from the time t-1 to the time t using a real-time speed of the vehicle from the time t-1 to the time t, and determining the driving distance of the vehicle from the time t-1 to the time t based on the corrected acceleration in combination with a driving time from the time t-1 to the time t;
and determining the posture change of the vehicle from the time t-1 to the time t based on an angular speed of the vehicle from the time t-1 to the time t.
11. The apparatus of claim 9, wherein the position estimation unit is configured to:
superimposing the driving distance on the coordinates of the first target position of the vehicle at the time t-1 to obtain the coordinates of the first estimated position of the vehicle at the time t;
and adjusting a posture of the vehicle at the first target position at the time t-1 according to the posture change to determine a posture of the vehicle at the first estimated position at the time t.
12. The apparatus of claim 7, wherein the position estimation unit matches the first lane line with the second lane line by:
cyclically performing the following operations on the first lane line and the second lane line until a lane line intercept between the first lane line and the second lane line is smaller than a set lane line threshold, at which point the matching is determined to be successful:
sampling the first lane line to obtain a first sampling point group, and sampling the second lane line to obtain a second sampling point group;
determining, for each first sampling point in the first sampling point group, a sampling point distance to an adjacent second sampling point in the second sampling point group;
and determining a moving distance of the second lane line based on the obtained sampling point distances, and moving the second lane line according to the moving distance.
13. A computer device comprising a processor and a memory, wherein the memory stores program code that, when executed by the processor, causes the processor to perform the steps of the method of any of claims 1 to 6.
14. A computer readable storage medium, characterized in that it comprises program code which, when run on a computer device, causes the computer device to perform the steps of the method according to any one of claims 1 to 6.
15. A computer program product comprising computer instructions which, when executed by a processor, implement the steps of the method of any one of claims 1 to 6.
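Claims 6 and 12 above describe an iterative matching of the map lane line against the camera-recognized lane line. The sketch below, in the same illustrative Python style, shows one way such a loop could be realized; the even resampling, the use of the mean point-to-point offset as the moving distance, and the stopping test on the residual shift (standing in for the claimed intercept check) are assumptions for illustration rather than the patented procedure, as are the function names.

    import numpy as np

    def sample_points(polyline, num=20):
        """Evenly resample a lane line (N x 2 array of points) into `num` points."""
        polyline = np.asarray(polyline, dtype=float)
        seg = np.linalg.norm(np.diff(polyline, axis=0), axis=1)
        s = np.concatenate(([0.0], np.cumsum(seg)))          # arc length at each vertex
        targets = np.linspace(0.0, s[-1], num)
        return np.column_stack((np.interp(targets, s, polyline[:, 0]),
                                np.interp(targets, s, polyline[:, 1])))

    def match_lane_lines(first_line, second_line, threshold=0.05, max_iters=50):
        """Iteratively shift the recognized (second) lane line toward the map
        (first) lane line; returns the shifted line and the accumulated offset."""
        second = np.asarray(second_line, dtype=float).copy()
        total_offset = np.zeros(2)
        p1 = sample_points(first_line)                        # first sampling point group
        for _ in range(max_iters):
            p2 = sample_points(second)                        # second sampling point group
            d = np.linalg.norm(p1[:, None, :] - p2[None, :, :], axis=2)
            nearest = d.argmin(axis=1)                        # adjacent second sampling point per first point
            move = (p1 - p2[nearest]).mean(axis=0)            # moving distance for this round
            second += move
            total_offset += move
            if np.linalg.norm(move) < threshold:              # matching treated as successful
                break
        return second, total_offset

In this sketch the returned total_offset is the quantity that would feed a lane-matched position estimate; the patent does not prescribe this particular nearest-point scheme or threshold value.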
CN202211397255.7A 2022-11-09 2022-11-09 Vehicle position correction method, device, equipment and storage medium Pending CN118010038A (en)

Priority Applications (1)

Application Number: CN202211397255.7A
Priority Date: 2022-11-09
Filing Date: 2022-11-09
Title: Vehicle position correction method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number: CN202211397255.7A
Priority Date: 2022-11-09
Filing Date: 2022-11-09
Title: Vehicle position correction method, device, equipment and storage medium

Publications (1)

Publication Number: CN118010038A
Publication Date: 2024-05-10

Family

ID=90954837

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211397255.7A Pending CN118010038A (en) 2022-11-09 2022-11-09 Vehicle position correction method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN118010038A (en)

Similar Documents

Publication Publication Date Title
CN112284400B (en) Vehicle positioning method and device, electronic equipment and computer readable storage medium
CN114111774B (en) Vehicle positioning method, system, equipment and computer readable storage medium
CN113887400B (en) Obstacle detection method, model training method and device and automatic driving vehicle
de Paula Veronese et al. Evaluating the limits of a LiDAR for an autonomous driving localization
CN115824235B (en) Lane positioning method, lane positioning device, computer equipment and readable storage medium
CN111121755B (en) Multi-sensor fusion positioning method, device, equipment and storage medium
CN115164936A (en) Global pose correction method and device for point cloud splicing in high-precision map manufacturing
CN112699765A (en) Method and device for evaluating visual positioning algorithm, electronic equipment and storage medium
CN110068323B (en) Network time delay positioning error compensation method and device and electronic equipment
CN114018269B (en) Positioning method, positioning device, electronic equipment, storage medium and automatic driving vehicle
CN113177980B (en) Target object speed determining method and device for automatic driving and electronic equipment
CN115900697B (en) Object motion trail information processing method, electronic equipment and automatic driving vehicle
CN115512336B (en) Vehicle positioning method and device based on street lamp light source and electronic equipment
CN113758492A (en) Map detection method and device
CN112987707A (en) Automatic driving control method and device for vehicle
CN113902047B (en) Image element matching method, device, equipment and storage medium
CN115792985A (en) Vehicle positioning method and device, electronic equipment, storage medium and vehicle
CN118010038A (en) Vehicle position correction method, device, equipment and storage medium
CN114088104B (en) Map generation method under automatic driving scene
CN112595330B (en) Vehicle positioning method and device, electronic equipment and computer readable medium
CN114993317A (en) Indoor and outdoor seamless positioning method based on multi-source fusion
Zhou et al. Hardware and software design of BMW system for multi-floor localization
CN117091596B (en) Gesture information acquisition method and related equipment
CN113923774B (en) Target terminal position determining method and device, storage medium and electronic equipment
CN111461982B (en) Method and apparatus for splice point cloud

Legal Events

Date Code Title Description
PB01 Publication