CN115307642A - Method and device for determining vehicle position and attitude data and electronic equipment - Google Patents


Info

Publication number
CN115307642A
Authority
CN
China
Prior art keywords: value, time, vehicle, moment, error
Prior art date
Legal status
Pending
Application number
CN202210933228.0A
Other languages
Chinese (zh)
Inventor
杨鹏斌
宋明亮
张建旭
倪菲
Current Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN202210933228.0A
Publication of CN115307642A


Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 - Instruments for performing navigational calculations
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application discloses a method and a device for determining vehicle position and attitude data and electronic equipment, relates to the technical field of computers, and particularly relates to the fields of artificial intelligence such as automatic driving, internet of vehicles, intelligent cabins and computer vision. The specific implementation scheme is as follows: acquiring first position and attitude data of a vehicle at a first moment, a first predicted value of a first relative state of the vehicle at a second moment relative to the first moment, and a first error value of an error variable corresponding to the first relative state; wherein the first time is earlier than the second time; determining a first estimated value of the first relative state according to the first error value and the first predicted value; and determining second position and attitude data of the vehicle at a second moment according to the first position and attitude data and the first estimation value. According to the method, the position and attitude data of the vehicle at the second moment are determined based on the error value of the error variable of the relative state between different moments, so that the influence of map projection errors is avoided, and the accuracy of the position and attitude data is improved.

Description

Method and device for determining vehicle position and attitude data and electronic equipment
Technical Field
The application relates to the technical field of computers, in particular to artificial intelligence fields such as automatic driving, internet of vehicles, intelligent cabins and computer vision, and specifically relates to a method and a device for determining vehicle position and attitude data and electronic equipment.
Background
With the development of the modern automobile industry, intelligent driving systems have become the development trend of the automobile industry, and people's requirements for them are becoming higher and higher. Most intelligent driving systems require feedback based on the position and attitude of the vehicle, so the accuracy of the vehicle position and attitude plays a crucial role in an intelligent driving system.
Therefore, how to improve the accuracy of the determined vehicle position and posture data is an urgent technical problem to be solved.
Disclosure of Invention
The application provides a method and a device for determining vehicle position and attitude data and electronic equipment. The specific scheme is as follows:
according to an aspect of the present application, there is provided a method of determining vehicle position and orientation data, including:
acquiring first position and attitude data of a vehicle at a first moment, a first predicted value of a first relative state of the vehicle at a second moment relative to the first moment, and a first error value of an error variable corresponding to the first relative state; wherein the first time is earlier than the second time;
determining a first estimated value of the first relative state according to the first error value and the first predicted value;
and determining second position and posture data of the vehicle at a second moment according to the first position and posture data and the first estimation value.
According to another aspect of the present application, there is provided a vehicle position and orientation data determination apparatus including:
the acquisition module is used for acquiring first position and attitude data of a vehicle at a first moment, a first predicted value of a first relative state of the vehicle at a second moment relative to the first moment and a first error value of an error variable corresponding to the first relative state; wherein the first moment is earlier than the second moment;
a first determination module for determining a first estimate of the first relative state based on the first error value and the first predicted value;
and the second determining module is used for determining second position and posture data of the vehicle at a second moment according to the first position and posture data and the first estimation value.
According to another aspect of the present application, there is provided an electronic device including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to cause the at least one processor to perform the method of the above embodiments.
According to another aspect of the present application, there is provided a non-transitory computer readable storage medium having stored thereon computer instructions for causing the computer to perform the method according to the above-described embodiments.
According to another aspect of the present application, a computer program product is provided, comprising a computer program which, when being executed by a processor, carries out the steps of the method of the above-mentioned embodiment.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present application, nor do they limit the scope of the present application. Other features of the present application will become apparent from the following description.
Drawings
The drawings are included to provide a better understanding of the present solution and are not intended to limit the present application. Wherein:
fig. 1 is a schematic flowchart of a method for determining vehicle position and orientation data according to an embodiment of the present application;
FIG. 2 is a schematic diagram illustrating a variation of a body coordinate system corresponding to a vehicle motion provided by an embodiment of the present application;
FIG. 3 is a schematic flow chart illustrating a method for determining vehicle position and orientation data according to another embodiment of the present application;
fig. 4 is a schematic flowchart of a method for determining vehicle position and attitude data according to another embodiment of the present application;
fig. 5 is a schematic diagram of lane lines at different times according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of a device for determining vehicle position and orientation data according to an embodiment of the present application;
fig. 7 is a block diagram of an electronic device for implementing a method for determining vehicle position and orientation data according to an embodiment of the present application.
Detailed Description
The following description of the exemplary embodiments of the present application, taken in conjunction with the accompanying drawings, includes various details of the embodiments of the application to assist in understanding, which are to be considered exemplary only. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present application. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
Artificial intelligence is a subject that studies how to use computers to simulate certain human thinking processes and intelligent behaviors (such as learning, reasoning, thinking and planning), and it has both hardware-level and software-level technologies. Artificial intelligence hardware technologies generally include technologies such as sensors, dedicated artificial intelligence chips, cloud computing, distributed storage and big data processing; artificial intelligence software technologies include computer vision technology, speech recognition technology, natural language processing technology, deep learning, big data processing technology, knowledge graph technology and the like.
Automatic driving, also known as unmanned driving, computer driving or wheeled mobile robots, is a leading-edge technology that relies on computer and artificial intelligence techniques to accomplish complete, safe and effective driving without manual manipulation.
The concept of the internet of vehicles is derived from the internet of things. The internet of vehicles takes vehicles in driving as the information perception objects and, by means of a new generation of information and communication technologies, realizes network connection between the vehicle and X (namely vehicles, people, roads and service platforms), so as to improve the overall intelligent driving level of vehicles, provide users with safe, comfortable, intelligent and efficient driving experience and traffic services, improve traffic operation efficiency, and raise the intelligence level of social traffic services.
The intelligent cabin is used for transforming the riding space in the vehicle, so that the driving and riding experience can be more comfortable and intelligent.
Computer vision is a science that studies how to make a machine "see"; it uses cameras and computers, instead of human eyes, to perform machine vision tasks such as identification, tracking and measurement of a target, and further performs image processing so that the result becomes an image more suitable for human eyes to observe or for transmission to an instrument for detection.
A determination method, an apparatus, an electronic device, and a storage medium of vehicle position and orientation data of an embodiment of the present application are described below with reference to the drawings.
Fig. 1 is a schematic flowchart of a method for determining vehicle position and orientation data according to an embodiment of the present disclosure.
The method for determining vehicle position and orientation data according to the embodiment of the present application may be performed by the apparatus for determining vehicle position and orientation data according to the embodiment of the present application, and the apparatus may be configured in an electronic device to determine the vehicle position and orientation data based on an error value of a state variable between different times.
The electronic device may be any device with computing capability, for example, a personal computer, a mobile terminal, a server, and the like, and the mobile terminal may be a hardware device with various operating systems, touch screens, and/or display screens, such as an in-vehicle device, a mobile phone, a tablet computer, a personal digital assistant, a wearable device, and the like.
As shown in fig. 1, the method for determining vehicle position and orientation data includes:
step 101, first position and posture data of a vehicle at a first moment, a first predicted value of a first relative state of the vehicle at a second moment relative to the first moment, and a first error value of an error variable corresponding to the first relative state are obtained.
The first time is earlier than the second time; for example, the second time may be the current time, and the first time may be the previous time before the current time.
In this application, the first position and orientation data may be position and orientation data of the vehicle at the first time in a world coordinate system, and the first predicted value of the first relative state may be obtained based on a result of measurement and prediction of the target sensor on the vehicle at the second time, for example, may be obtained based on a predicted value of the relative state of the first time with respect to the previous time and a measured value of the inertial measurement unit.
In the present application, the first error value of the error variable corresponding to the first relative state may be obtained through filter estimation; for example, the first error value may be obtained by updating the filter according to a vehicle speed observation, or by updating the filter according to a lane line observation.
In practical application, the frequency of the measured vehicle speed and the frequency of the perceived lane line may be different, so that in the application, when the vehicle speed is obtained, a first error value can be obtained by updating according to vehicle speed observation; when the lane line equation is obtained, the first error value can be obtained by updating according to the lane line observation.
Step 102, determining a first estimated value of the first relative state according to the first error value and the first predicted value.
In the application, the first predicted value may be corrected by using the first error value to obtain a first estimated value of the first relative state.
For example, the first position and orientation data is the position and orientation data of the vehicle at a first time k in a world coordinate system, and the first position and orientation data of the vehicle at time k can be represented by the following formula:

x_k^w = [p_{b_k}^w  q_{b_k}^w]^T    (1)

wherein p_{b_k}^w is the position of the vehicle at time k in the world coordinate system w, and q_{b_k}^w is the attitude (as a quaternion) of the vehicle at time k in the world coordinate system w.
The relative motion of the vehicle at time k+1 with respect to time k can be expressed as:

x_{b_{k+1}}^{b_k} = [p_{b_{k+1}}^{b_k}  v_{b_{k+1}}^{b_k}  q_{b_{k+1}}^{b_k}  b_a  b_g  g^{b_k}]^T    (2)

wherein p_{b_{k+1}}^{b_k}, v_{b_{k+1}}^{b_k} and q_{b_{k+1}}^{b_k} are, respectively, the relative displacement, the relative velocity and the relative attitude of the vehicle at time k+1 with respect to time k; b_a and b_g are the biases of the accelerometer and the gyroscope, respectively; and g^{b_k} is the gravity vector in the body coordinate system b_k at time k. The error variable δx of x_{b_{k+1}}^{b_k} is defined as:

δx = [δp  δv  δθ  δb_a  δb_g  δg]^T    (3)

wherein δp, δv, δθ, δb_a, δb_g and δg respectively represent the displacement error, velocity error, attitude error, accelerometer bias error, gyroscope bias error and gravity vector error in the vehicle body coordinate system.
Based on δ x and predicted values
Figure BDA0003782554130000038
Obtained by the following formula
Figure BDA0003782554130000039
Figure BDA00037825541300000310
Wherein the content of the first and second substances,
Figure BDA00037825541300000311
is a first estimate of the first relative motion of the vehicle at time k +1 with respect to time k,
Figure BDA00037825541300000312
is a first predictor of a first relative motion,
Figure BDA00037825541300000313
respectively a relative displacement predicted value, a relative speed predicted value, a relative attitude predicted value, an accelerometer bias predicted value, a gyroscope bias predicted value and a vehicle body coordinate system b at the moment of k +1 relative to the moment of k k And (4) predicting the lower gravity vector.
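For illustration only, the correction of equation (4) can be sketched in code as follows (this sketch is not part of the patent text); representing the relative state as a Python dictionary and injecting the attitude error through a small-angle rotation are assumptions made for the example.

```python
import numpy as np

def small_angle_quat(dtheta):
    """Convert a small rotation vector (3,) into a unit quaternion [w, x, y, z]."""
    angle = np.linalg.norm(dtheta)
    if angle < 1e-12:
        return np.array([1.0, 0.0, 0.0, 0.0])
    axis = dtheta / angle
    return np.concatenate(([np.cos(angle / 2.0)], np.sin(angle / 2.0) * axis))

def quat_mul(q1, q2):
    """Hamilton product of two quaternions [w, x, y, z]."""
    w1, x1, y1, z1 = q1
    w2, x2, y2, z2 = q2
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def inject_error(pred, dx):
    """Equation (4): combine the predicted relative state with the estimated error.

    pred: dict with keys p, v, q, ba, bg, g (q as quaternion [w, x, y, z]).
    dx:   18-vector [dp, dv, dtheta, dba, dbg, dg].
    """
    dp, dv, dth, dba, dbg, dg = np.split(dx, 6)
    return {
        "p": pred["p"] + dp,                               # relative displacement
        "v": pred["v"] + dv,                               # relative velocity
        "q": quat_mul(pred["q"], small_angle_quat(dth)),   # relative attitude
        "ba": pred["ba"] + dba,                            # accelerometer bias
        "bg": pred["bg"] + dbg,                            # gyroscope bias
        "g": pred["g"] + dg,                               # gravity vector in b_k
    }
```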
For ease of understanding, fig. 2 is a schematic diagram of the variation of the body coordinate system corresponding to the vehicle motion provided by an embodiment of the present application. Fig. 2 (a) shows the movement path of the vehicle from time k to time k+1, and fig. 2 (b) shows the body coordinate system b_k at time k and the body coordinate system b_{k+1} at time k+1, the body coordinate system being represented by the three axes x, y and z of a three-dimensional coordinate system.
Based on the above, the state equation may be:

d(δx̄)/dt = F δx̄ + G n    (5)

wherein δx̄ denotes the predicted value of the error variable; F and G are the coefficient matrices defined by equations (6) and (7), and n is the noise of the accelerometer and gyroscope of the inertial measurement unit, defined by equation (8); the prediction of the state variables can be made based on equation (5). Equations (6) and (7) (not reproduced here) express F and G in terms of R_{b_t}^{b_k}, the rotation matrix representing the attitude of the body coordinate system at time t relative to the body coordinate system b_k at time k, the accelerometer measurement a_t of the inertial measurement unit at time t (a_t is 3-dimensional data), and the gyroscope measurement of the inertial measurement unit at time t (also 3-dimensional data). The noise vector of equation (8) is:

n = [n_a  n_g  n_{b_a}  n_{b_g}]^T    (8)

wherein n_a, n_g, n_{b_a} and n_{b_g} are all preset three-dimensional vectors; n_a represents the Gaussian noise of the accelerometer of the inertial measurement unit, n_g represents the Gaussian noise of the gyroscope of the inertial measurement unit, n_{b_a} represents the random walk noise of the accelerometer of the inertial measurement unit, and n_{b_g} represents the random walk noise of the gyroscope of the inertial measurement unit.
And 103, determining second position and posture data of the vehicle at a second moment according to the first position and posture data and the first estimation value.
In this application, after the first estimation value of the first relative motion is determined, the first position and orientation data of the vehicle at the first time may be corrected by using the first estimation value, so as to obtain the second position and orientation data of the vehicle at the second time.
In the present application, the first position and orientation data and the second position and orientation data may be position and orientation data in the same coordinate system. For example, both may be position and orientation data in a world coordinate system; in this case, the second position and orientation data of the vehicle in the world coordinate system at the current time may be determined based on the first position and orientation data of the vehicle in the world coordinate system at the previous time and the first estimated value of the first relative state of the vehicle at the current time with respect to the previous time.
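As an illustrative sketch only (the patent does not give an explicit formula for this composition), the second position and attitude data could be obtained from the first position and attitude data and the estimated relative motion as follows; the quaternion convention [w, x, y, z] and the function names are assumptions of this example.

```python
import numpy as np

def quat_to_rot(q):
    """Rotation matrix for a unit quaternion [w, x, y, z]."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def quat_mul(q1, q2):
    """Hamilton product of quaternions [w, x, y, z]."""
    w1, x1, y1, z1 = q1
    w2, x2, y2, z2 = q2
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def compose_pose(p_w_k, q_w_k, p_rel, q_rel):
    """Second position/attitude in the world frame from the first pose and
    the estimated relative motion expressed in the body frame b_k."""
    p_w_k1 = p_w_k + quat_to_rot(q_w_k) @ p_rel   # shift by the relative displacement
    q_w_k1 = quat_mul(q_w_k, q_rel)               # rotate by the relative attitude
    return p_w_k1, q_w_k1
```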
In the embodiment of the application, a first error value of an error variable of a first relative state of a second moment relative to a first moment is obtained, a first estimation value of the first relative state is determined according to the first error value and a first prediction value of the first relative state, and second position and posture data of a vehicle at the second moment are determined according to the first estimation value and first position and posture data of the vehicle at the first moment. Therefore, the position and posture data of the vehicle at the second moment are determined by the error value of the error variable based on the relative state between different moments, so that the influence of map projection errors is avoided, and the accuracy of the position and posture data is improved.
Fig. 3 is a schematic flowchart of a method for determining vehicle position and orientation data according to another embodiment of the present application.
As shown in fig. 3, the method for determining vehicle position and orientation data includes:
step 301, first position and posture data of the vehicle at a first time and a first predicted value of a first relative state of the vehicle at a second time relative to the first time are obtained.
The description of the first position and orientation data may refer to the above embodiments, and therefore, the description thereof is omitted here.
According to the method and the device, a second estimated value of a second relative state of the vehicle at the first time relative to a third time and a measured value of the inertial measurement unit at the second time can be obtained, and a first predicted value of the first relative state is obtained according to the second estimated value of the second relative state and the measured value of the inertial measurement unit at the second time.
The third time is earlier than the first time, and the first time is earlier than the second time. For example, if the first time is k, the third time may be k-1 time, and the second time is k +1 time, then a second estimated value of the second relative state of k time with respect to k-1 time and a measured value of the inertial measurement unit at k +1 time may be obtained, so as to obtain a first predicted value of the first relative state of k +1 time with respect to k time.
Thus, by using the estimated value of the second relative state at the first time and the measured value of the inertial measurement unit, the prediction of the relative state of the second time with respect to the first time can be realized.
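For illustration, one standard way such a prediction could be computed is by strapdown dead reckoning on the inertial measurement unit data, sketched below; the patent does not spell out the propagation equations, so the integration scheme, the gravity convention and all names are assumptions of this example.

```python
import numpy as np

def propagate_relative_state(p, v, R, ba, bg, g, acc, gyro, dt):
    """One Euler integration step of the relative motion expressed in frame b_k.

    p, v : relative displacement / velocity of the body frame w.r.t. b_k
    R    : rotation matrix from the current body frame to b_k
    ba,bg: accelerometer / gyroscope biases
    g    : gravity vector expressed in b_k
    acc, gyro : raw IMU measurements for this step; dt : sample interval
    """
    a_bk = R @ (acc - ba) + g               # bias-corrected specific force in b_k plus gravity
    p = p + v * dt + 0.5 * a_bk * dt * dt   # integrate position
    v = v + a_bk * dt                       # integrate velocity
    w = gyro - bg                           # bias-corrected angular rate
    angle = np.linalg.norm(w) * dt
    if angle > 1e-12:
        axis = w / np.linalg.norm(w)
        K = np.array([[0.0, -axis[2], axis[1]],
                      [axis[2], 0.0, -axis[0]],
                      [-axis[1], axis[0], 0.0]])
        dR = np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K)  # Rodrigues formula
    else:
        dR = np.eye(3)
    R = R @ dR                              # update attitude of the body frame w.r.t. b_k
    return p, v, R, ba, bg, g
```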
Step 302, a vehicle speed observation matrix at a second moment, an estimated value of a covariance matrix between the position and the attitude of the vehicle at the first moment, and a second error value of an error variable corresponding to a second relative state of the vehicle at the first moment relative to a third moment are obtained.
The third time is earlier than the first time, and the first time is earlier than the second time.
Due to the nature of the lane line feature itself (parallel lines), the lane line observation cannot effectively constrain motion along the direction of the lane line, so an observation of longitudinal motion needs to be introduced. In the configuration of an actual vehicle, an observation of the vehicle speed is usually available, so the vehicle speed can be introduced as the longitudinal observation.
In the application, a first speed of the vehicle in the longitudinal direction under the body coordinate system at the second moment and a second speed of the first speed under the body coordinate system at the first moment can be obtained through measurement of the sensor at the second moment. Since the first speed and the second speed are not in the same coordinate system, the vehicle speed observation matrix can be obtained according to the first vehicle speed, the second speed and a rotation matrix from the second speed to the first speed. Therefore, the vehicle speed observation matrix is obtained according to the observation speed of the vehicle in the longitudinal direction of the vehicle body coordinate system, so that the error value of the error variable can be updated based on the vehicle speed observation matrix, and the estimation of the error variable is realized.
For example, the first speed of the vehicle in the longitudinal direction in the body coordinate system at time k+1 is V_speed, and the running speed of the vehicle in the body coordinate system at time k+1 can be expressed as:

V_{b_{k+1}} = [V_speed  0  0]    (9)

Then:

δV_{b_{k+1}} = V_{b_{k+1}} - R_{b_k}^{b_{k+1}} v_{b_{k+1}}^{b_k}    (10)

wherein δV_{b_{k+1}} denotes the observation error of the vehicle speed in the body coordinate system b_{k+1} at time k+1, R_{b_k}^{b_{k+1}} denotes the rotation matrix from the second speed to the first speed, and v_{b_{k+1}}^{b_k} denotes the second speed, i.e. the first speed expressed in the vehicle body coordinate system at the first moment. R_{b_k}^{b_{k+1}} may be obtained based on the amount of change of the vehicle attitude between time k and time k+1.

The vehicle speed observation matrix S at time k+1 (equation (11), not reproduced here) is then obtained by linearizing equation (10) with respect to the error variable δx, where (·)× denotes arranging a three-dimensional vector as an anti-symmetric matrix.
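A minimal sketch of the longitudinal-speed residual of equations (9) and (10), for illustration only; the variable names are not taken from the patent.

```python
import numpy as np

def speed_residual(v_speed, R_bk_to_bk1, v_rel_bk):
    """Observation error of the body-frame velocity at time k+1.

    v_speed     : measured longitudinal speed of the vehicle at time k+1
    R_bk_to_bk1 : rotation matrix taking vectors from frame b_k to frame b_{k+1}
    v_rel_bk    : predicted velocity of the vehicle expressed in frame b_k
    """
    v_obs_bk1 = np.array([v_speed, 0.0, 0.0])   # equation (9): only longitudinal motion is observed
    return v_obs_bk1 - R_bk_to_bk1 @ v_rel_bk   # equation (10): residual in frame b_{k+1}
```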
In the present application, the second error value of the error variable corresponding to the second relative state of the vehicle at the first time relative to the third time may be obtained by updating with a filter based on the vehicle speed observation matrix or the lane line observation matrix at the first time.
Step 303, determining a predicted value of the second time covariance matrix according to the estimated value of the first time covariance matrix.
Assuming that the first time is k and the second time is k +1, the predicted value of the covariance matrix at the time k +1 can be determined by using the following formula
Figure BDA0003782554130000061
Figure BDA0003782554130000062
Wherein, P k The estimated value of the covariance matrix at the moment k is shown, and I is an identity matrix; f and G are matrices, which can be given by the following equations (6) and (7); q is a noise matrix, which can be given by the above equation (8); Δ t represents a time interval (scalar) that can be considered as the measurement interval of the inertial measurement unit.
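For illustration, equation (12) can be evaluated as follows, assuming F and G from equations (6)-(7) and the noise matrix Q from equation (8) are already assembled as NumPy arrays; the function name is hypothetical.

```python
import numpy as np

def predict_covariance(P_k, F, G, Q, dt):
    """Equation (12): propagate the error covariance over one IMU interval dt."""
    n = P_k.shape[0]
    A = np.eye(n) + F * dt     # discrete-time error transition matrix
    B = G * dt                 # discrete-time noise input matrix
    return A @ P_k @ A.T + B @ Q @ B.T
```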
And step 304, updating the second error value according to the predicted values of the vehicle speed observation matrix and the covariance matrix to determine a first error value.
In the application, the filter gain matrix at the second moment can be determined according to the predicted values of the vehicle speed observation matrix and the covariance matrix, and then the second error value is updated according to the filter gain matrix at the second moment and the first longitudinal speed of the vehicle under the vehicle body coordinate system at the second moment so as to determine the first error value. Therefore, the error value at the first moment can be updated to obtain the error value at the second moment by observing the matrix according to the vehicle speed and the covariance matrix between the position and the posture.
Suppose that the first time is k and the second time is k+1, that the second error value of the error variable corresponding to the relative state at time k is δx_k, and that the first error value of the error variable corresponding to the relative state at time k+1 is δx_{k+1}. For the vehicle speed observation, error-state Kalman filtering may be used for the update, with the following update formulas:

K_{k+1} = P̄_{k+1} S_{k+1}^T (S_{k+1} P̄_{k+1} S_{k+1}^T + R_{k+1})^{-1}    (13)

δx_{k+1} = δx_k + K_{k+1} (V_{b_{k+1}} - R̄_{b_k}^{b_{k+1}} v̄_{b_{k+1}}^{b_k})    (14)

P_{k+1} = (I - K_{k+1} S_{k+1}) P̄_{k+1}    (15)

wherein K_{k+1} is the Kalman filter gain matrix at time k+1; S_{k+1} is the vehicle speed observation matrix at time k+1; R̄_{b_k}^{b_{k+1}} is the predicted value of the rotation matrix from the second speed to the first speed, and v̄_{b_{k+1}}^{b_k} is the predicted value of the second speed, i.e. of the first speed expressed in the vehicle body coordinate system at the first moment; R_{k+1} is a measurement noise matrix, which can be set according to the confidence of the measurement; P_{k+1} is the estimated value of the covariance matrix at time k+1, which can be used to determine the predicted value of the covariance matrix at time k+2 in the next update.

It should be noted that the overbars on R̄_{b_k}^{b_{k+1}} and v̄_{b_{k+1}}^{b_k} indicate that these are predicted values, whereas the corresponding quantities R_{b_k}^{b_{k+1}} and v_{b_{k+1}}^{b_k} in formula (10) above express only the relation between the quantities themselves.
Step 305 determines a first estimated value of the first relative state based on the first error value and the first predicted value.
After the first error value δx_{k+1} is determined on the basis of the above formulas, δx_{k+1} and the first predicted value x̄_{b_{k+1}}^{b_k} can be added to obtain the first estimated value x̂_{b_{k+1}}^{b_k} of the first relative state at time k+1.
And step 306, determining second position and posture data of the vehicle at a second moment according to the first position and posture data and the first estimation value.
In the present application, steps 305-306 are similar to those described in the above embodiments, and therefore are not described herein again.
In the embodiment of the application, the vehicle speed observation matrix at the second moment and the estimation value of the covariance matrix between the position and the attitude of the vehicle at the first moment are obtained, the predicted value of the covariance matrix at the second moment is determined according to the estimation value of the covariance matrix at the first moment, the second error value is updated according to the vehicle speed observation matrix and the predicted value of the covariance matrix to determine the first error value, then the first estimation value of the first relative state is determined according to the first error value and the first predicted value, and the position and attitude data of the vehicle at the second moment is determined according to the first estimation value and the position and attitude data of the vehicle at the first moment. Because the vehicle speed detection is usually performed in practical application, the vehicle speed is introduced as longitudinal observation, the error value of the error variable in the relative state is obtained by performing error update on the vehicle speed observation matrix, and the position and posture data of the vehicle is determined by using the error value, so that the application range is wide.
Fig. 4 is a schematic flowchart of a method for determining vehicle position and orientation data according to another embodiment of the present application.
As shown in fig. 4, the method for determining vehicle position and orientation data includes:
step 401, first position and posture data of the vehicle at a first moment and a first predicted value of a first relative state of the vehicle at a second moment relative to the first moment are obtained.
In the present application, step 401 is similar to step 301 described in the above embodiment, and therefore is not described again.
Step 402, a lane line observation matrix at a second time, an estimated value of a covariance matrix between the position and the attitude of the vehicle at the first time, and a second error value of an error variable corresponding to a second relative state of the vehicle at the first time with respect to a third time are obtained.
The third moment is earlier than the first moment, and the first moment is earlier than the second moment.
In the present application, the second error value of the error variable corresponding to the second relative state of the vehicle at the first time relative to the third time may be obtained by updating with a filter based on the vehicle speed observation matrix or the lane line observation matrix at the first time.
In the method, a first lane line equation at a second moment and a second lane line equation at a first moment output by a lane line perception sensor on the vehicle can be obtained, the first lane line equation and the second lane line equation are sampled, and a plurality of first sampling points on the first lane line and a plurality of second sampling points on the second lane line are obtained. The sampling distances of the first lane line equation and the second lane line equation may be different or the same, and the present application does not limit this.
Then, for each first sampling point, the two target second sampling points closest to that first sampling point can be determined from the plurality of second sampling points; an observation matrix corresponding to each first sampling point is determined according to the first position corresponding to that first sampling point and the second positions of its two target second sampling points in the vehicle body coordinate system at the first moment, and the observation matrices corresponding to the plurality of first sampling points can be combined to obtain the lane line observation matrix.
Therefore, the lane line equation is introduced as observation, and the motion of the vehicle is restrained in the transverse direction and the heading direction, so that the accuracy of the subsequently determined error value can be improved.
In practical applications, the lane line sensing frequency is generally greater than 10Hz (i.e. the interval is less than 100 milliseconds), and the road surface is generally flat, so that the lane lines in two frames of sensing data of the vehicle can be considered to be in the same plane, that is, the sensing lane line equation of the current frame and the sensing lane line equation of the previous frame are considered to describe the lane line entity in the same plane.
Assuming that the first time is k and the second time is k+1, the observation equation of the lane line defines, for each sample point converted into the body coordinate system b_k, a residual f with respect to the two nearest points on the lane line l_k (equation (16), not reproduced here).
the physical meaning of the lane line observation equation can be seen in fig. 5, and fig. 5 is a schematic diagram of the lane line at different times provided by the embodiment of the present application.
In fig. 5, l_k is the lane line corresponding to the lane line equation at time k, and l_{k+1} is the lane line corresponding to the lane line equation at time k+1; p_a^{b_k} and p_b^{b_k} denote the positions of two points on l_k at time k, and p_i^{b_k} denotes the position of the i-th sample point on l_{k+1} at time k+1, converted, according to the corresponding state prediction value, into the vehicle body coordinate system b_k at time k. The conversion formula is as follows:

p_i^{b_k} = R_{b_{k+1}}^{b_k} p_i^{b_{k+1}} + p_{b_{k+1}}^{b_k}    (17)

wherein R_{b_{k+1}}^{b_k} is the rotation matrix representing the attitude at time k+1 relative to the body coordinate system b_k, and p_{b_{k+1}}^{b_k} is the relative displacement of the vehicle at time k+1 with respect to time k.
In the present application, the lane line equation of l_{k+1} may first be sampled at a preset interval d, and the lane line equation of l_k may likewise be sampled. Here d may be two meters or another value, which is not limited in this application.

Thereafter, the sample points obtained from l_{k+1} can be converted into the vehicle body coordinate system b_k according to the above formula (17). Next, for each converted sample point, the golden section method is used to find on l_k the pair of points p_a^{b_k} and p_b^{b_k} that are closest to that sample point, and the residual calculation can be performed using the above equation (16).
If l_{k+1} has n sample points, the corresponding lane line observation matrix H can be obtained as:

H = [H_1 … H_i … H_n]^T    (18)

H_i = (∂f/∂p_i^{b_k}) (∂p_i^{b_k}/∂δx)    (19)

wherein H_i denotes the observation matrix corresponding to the i-th sample point; ∂f/∂p_i^{b_k} denotes the derivative of f (i.e. the above equation (16)) with respect to p_i^{b_k}, and ∂p_i^{b_k}/∂δx denotes the derivative of p_i^{b_k} with respect to δx; their explicit forms are given by equations (20) and (21) (not reproduced here), in which J_r(θ) denotes the right Jacobian matrix.
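Equations (19)-(21) give each row H_i in closed form; since those expressions are not reproduced above, the sketch below assembles the stacked matrix H of equation (18) by numerically differentiating per-point residual functions, purely as an illustration of the structure, not as the patent's method.

```python
import numpy as np

def stack_observation_matrix(residual_fns, dx_dim, eps=1e-6):
    """Equation (18): stack the per-point observation rows H_i into one matrix H.

    residual_fns : list of functions f_i(dx) -> scalar residual for the i-th sample point
    dx_dim       : dimension of the error variable delta x
    """
    rows = []
    for f in residual_fns:
        base = f(np.zeros(dx_dim))
        row = np.zeros(dx_dim)
        for j in range(dx_dim):
            dx = np.zeros(dx_dim)
            dx[j] = eps
            row[j] = (f(dx) - base) / eps   # numerical d f_i / d (delta x)_j
        rows.append(row)
    return np.vstack(rows)
```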
And step 403, determining a predicted value of the covariance matrix at the second moment according to the estimated value of the covariance matrix at the first moment.
In the present application, step 403 is similar to that described in the above embodiments, and therefore is not described herein again.
In step 404, the second error value is updated according to the predicted values of the lane line observation matrix and the covariance matrix to determine a first error value.
In the application, the lane line observation matrix corresponding to the jth update and the third error value of the error variable of the first relative state obtained by the jth update can be obtained, where j is a natural number. When j is 0, the lane line observation matrix corresponding to the jth update may be the lane line observation matrix obtained at the second time, and the third error value of the error variable of the first relative state obtained by the jth update may be the second error value of the error variable of the second relative state of the vehicle at the first time relative to the third time. The third time is earlier than the first time.
Then, a filter gain matrix obtained by the jth update is determined according to the predicted value of the covariance matrix and the lane line observation matrix corresponding to the jth update; an update error of the (j+1)th update relative to the jth update is determined according to the filter gain matrix obtained by the jth update, the lane line observation matrix corresponding to the jth update and the third error value obtained by the jth update; and the third error value obtained by the jth update is corrected by using the update error corresponding to the (j+1)th update, so as to determine the third error value obtained by the (j+1)th update.
After the third error value obtained by the (j+1)th update is determined, the lane line observation value corresponding to the (j+1)th update is determined according to the above lane line observation equation (16). If the lane line observation value corresponding to the (j+1)th update is smaller than a preset threshold, the third error value obtained by the (j+1)th update is determined as the first error value. If the lane line observation value corresponding to the (j+1)th update is greater than or equal to the preset threshold, the third error value of the error variable of the first relative state is updated for the (j+2)th time according to the predicted value of the covariance matrix and the lane line observation matrix corresponding to the (j+1)th update, and so on until the lane line observation value is smaller than the preset threshold; the third error value obtained by the last update is then used as the first error value.
Suppose that the first time is k and the second time is k+1, that the second error value of the error variable corresponding to the relative state at time k is δx_k, and that the first error value of the error variable corresponding to the relative state at time k+1 is δx_{k+1}. Because the lane line equation observation has a relatively high degree of nonlinearity, the state estimation can be carried out in the form of iterative error-state Kalman filtering, which improves the accuracy under this strongly nonlinear problem. The update formulas are as follows:

K_{k+1,j} = P̄_{k+1} H_{k+1,j}^T (H_{k+1,j} P̄_{k+1} H_{k+1,j}^T + R_{k+1,j})^{-1}    (22)

The update error Δx_{j+1} is then obtained from K_{k+1,j} and the lane line observation residuals of the jth update (equation (23), not reproduced here), and

δx_{k+1,j+1} = δx_{k+1,j} + Δx_{j+1}    (24)

wherein K_{k+1,j} denotes the Kalman filter gain matrix obtained by the jth update; P̄_{k+1} denotes the predicted value of the covariance matrix at time k+1; H_{k+1,j} denotes the lane line observation matrix obtained by the jth update; R_{k+1,j} denotes the noise matrix obtained by the jth update; Δx_{j+1} denotes the update error of the (j+1)th update relative to the jth update; δx_{k+1,j} denotes the third error value obtained by the jth update; and P_{k+1} denotes the estimated value of the covariance matrix at time k+1.

The above update formulas (23)-(24) may be repeated a number of times until the lane line observation value is smaller than the preset threshold, and the third error value obtained by the last update is used as the first error value.

Finally, in equation (25) (not reproduced here), the estimated value P_{k+1} of the covariance matrix at time k+1 is obtained from the predicted value P̄_{k+1} of the covariance matrix at time k+1 together with the Kalman filter gain matrix K_{k+1,m}, the lane line observation matrix H_{k+1,m} and the noise matrix R_{k+1,m} of the last update (the mth update); P_{k+1} can then be used to determine the predicted value of the covariance matrix at time k+2 in the next update.
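A sketch of the iterative error-state update of equations (22)-(25), for illustration only; the exact form of equation (23) is not reproduced above, so a plain gain-times-residual step is assumed here, and the function and parameter names are hypothetical.

```python
import numpy as np

def iterated_lane_update(P_pred, dx_init, linearize, R_noise, threshold, max_iter=10):
    """Iterated error-state Kalman update for the lane line observation.

    linearize(dx) must return (H_j, r_j): the lane line observation matrix and the
    stacked residual vector re-evaluated at the current error estimate dx.
    Iteration stops once the observation value (residual magnitude) falls below the
    preset threshold.
    """
    dx = dx_init.copy()
    for _ in range(max_iter):
        H, r = linearize(dx)                                           # lane line observation matrix and residuals of the j-th update
        K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R_noise)   # equation (22): gain of the j-th update
        dx = dx + K @ r                                                # equations (23)-(24): apply the update error
        if np.linalg.norm(linearize(dx)[1]) < threshold:               # observation value below the preset threshold
            break
    P = (np.eye(P_pred.shape[0]) - K @ H) @ P_pred                     # covariance estimate using the last update's gain and H
    return dx, P
```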
In addition, after the update is completed, the state variables for the next time interval can be reinitialized according to equation (26) (not reproduced here), in which equations (27) and (28) (also not reproduced here) define, respectively, v^{b_{k+1}}, the speed of the vehicle at time k+1 in the vehicle body coordinate system, and g^{b_{k+1}}, the gravity vector at time k+1 in the body coordinate system b_{k+1}, both obtained by re-expressing the corresponding estimated quantities in the frame b_{k+1}.

Since the coordinate system is updated to the body coordinate system at time k+1, the attitude of the vehicle can be determined as the unit quaternion q_0, the position as 0, and the covariance matrix between the position and the attitude is reinitialized.
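For illustration, the reinitialization described above can be sketched as follows; expressing the new state as a Python dictionary and using the transpose of the estimated relative rotation to re-express the velocity and gravity vector (equations (27) and (28)) are assumptions of this example.

```python
import numpy as np

def reinitialize_state(v_hat_bk, g_hat_bk, R_hat_bk1_to_bk, ba_hat, bg_hat):
    """Re-initialise the relative state for the next interval after the update:
    the reference frame becomes the body frame b_{k+1}, so the position is reset
    to zero, the attitude to the unit quaternion, and the estimated velocity and
    gravity vector are re-expressed in b_{k+1}."""
    R_bk_to_bk1 = R_hat_bk1_to_bk.T                 # invert the estimated relative rotation
    return {
        "p": np.zeros(3),                           # position reset to 0
        "q": np.array([1.0, 0.0, 0.0, 0.0]),        # unit quaternion q0
        "v": R_bk_to_bk1 @ v_hat_bk,                # velocity expressed in b_{k+1}
        "ba": ba_hat,                               # biases carried over
        "bg": bg_hat,
        "g": R_bk_to_bk1 @ g_hat_bk,                # gravity vector expressed in b_{k+1}
    }
```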
According to the method and the device, iterative updating is carried out on the error value by utilizing the lane line observation matrix until the lane line observation value is smaller than the preset threshold value, the error value obtained by the last updating is used as the first error value, and the accuracy of the first error value is improved.
Step 405 determines a first estimate of the first relative state based on the first error value and the first predicted value.
And 406, determining second position and posture data of the vehicle at a second moment according to the first position and posture data and the first estimation value.
In the present application, steps 405 to 406 are similar to those described in the above embodiments, and therefore are not described herein again.
In the embodiment of the application, the second error value of the error variable of the second relative state of the first time relative to the third time is updated by using the lane line observation matrix to obtain the first error value of the error variable of the first relative state of the second time relative to the first time. By introducing the lane line equation as an observation, the motion of the vehicle is constrained in the lateral direction and the heading direction, and updating the error based on the lane line observation matrix to obtain the error value of the error variable of the relative state can improve the accuracy of the error value, and therefore the accuracy of the determined vehicle position and attitude data.
In order to implement the above embodiments, the present application further provides a device for determining vehicle position and orientation data. Fig. 6 is a schematic structural diagram of a device for determining vehicle position and orientation data according to an embodiment of the present application.
As shown in fig. 6, the vehicle position and orientation data determination device 600 includes:
the obtaining module 610 is configured to obtain first position and attitude data of a vehicle at a first time, a first predicted value of a first relative state of the vehicle at a second time relative to the first time, and a first error value of an error variable corresponding to the first relative state; wherein the first time is earlier than the second time;
a first determining module 620, configured to determine a first estimated value of the first relative state according to the first error value and the first predicted value;
and a second determining module 630, configured to determine second position and orientation data of the vehicle at a second time according to the first position and orientation data and the first estimation value.
In a possible implementation manner of this embodiment of the present application, the obtaining module 610 includes:
the first acquisition unit is used for acquiring a vehicle speed observation matrix at a second moment, an estimated value of a covariance matrix between the position and the posture of the vehicle at the first moment and a second error value of an error variable corresponding to a second relative state of the vehicle at the first moment relative to a third moment; wherein the third time is earlier than the first time;
the first determining unit is used for determining a predicted value of the covariance matrix at the second moment according to the estimated value of the covariance matrix at the first moment;
and the first updating unit is used for updating the second error value according to the predicted values of the vehicle speed observation matrix and the covariance matrix so as to determine a first error value.
In a possible implementation manner of the embodiment of the present application, the first updating unit is configured to:
determining a filter gain matrix at the second moment according to the vehicle speed observation matrix and the predicted value of the covariance matrix;
and updating the second error value according to the filter gain matrix and the first speed of the vehicle in the longitudinal direction under the vehicle body coordinate system at the second moment so as to determine the first error value.
In a possible implementation manner of the embodiment of the present application, the first obtaining unit is configured to:
acquiring a first speed of the vehicle in the longitudinal direction under the vehicle body coordinate system at a second moment and a second speed of the first speed under the vehicle body coordinate system at the first moment;
and acquiring a vehicle speed observation matrix according to the first vehicle speed, the second speed and a rotation matrix from the second speed to the first speed.
In a possible implementation manner of this embodiment of the present application, the obtaining module 610 includes:
a second obtaining unit, configured to obtain a lane line observation matrix at a second time, an estimated value of a covariance matrix between a position and an attitude of the vehicle at the first time, and a second error value of an error variable corresponding to a second relative state of the vehicle at the first time with respect to a third time; wherein the third time is earlier than the first time;
the second determining unit is used for determining a predicted value of the covariance matrix at the second moment according to the estimated value of the covariance matrix at the first moment;
and the second updating unit is used for updating the second error value according to the predicted values of the lane line observation matrix and the covariance matrix so as to determine the first error value.
In a possible implementation manner of the embodiment of the present application, the second updating unit is configured to:
acquiring a lane line observation matrix corresponding to the jth update and a third error value of the error variable of the first relative state obtained by the jth update; wherein j is a natural number;
determining a filter gain matrix obtained by the jth update according to the predicted value of the covariance matrix and the corresponding lane line observation matrix updated at the jth time;
determining an updating error corresponding to the j +1 th updating relative to the j th updating according to the filtering gain matrix obtained by the j th updating, the lane line observation matrix corresponding to the j th updating and a third error value obtained by the j th updating;
determining a third error value obtained by the j +1 th update according to the update error corresponding to the j +1 th update and the third error value corresponding to the j +1 th update;
determining the lane line observation value corresponding to the j +1 th update;
determining the third error value obtained by the (j + 1) th update as the first error value under the condition that the lane line observation value corresponding to the (j + 1) th update is smaller than a preset threshold value;
and under the condition that the lane line observation value corresponding to the (j + 1) th update is greater than or equal to the preset threshold, updating the third error value of the error variable of the first relative state for the (j + 2) th time according to the predicted value of the covariance matrix and the lane line observation matrix corresponding to the (j + 1) th update, until the lane line observation value is smaller than the preset threshold, and taking the third error value obtained by the last update as the first error value.
In a possible implementation manner of the embodiment of the present application, the second obtaining unit is configured to:
acquiring a first lane line equation at a second moment and a second lane line equation at a first moment;
respectively sampling a first lane line equation and a second lane line equation to obtain a plurality of first sampling points on a first lane line and a plurality of second sampling points on a second lane line;
determining two target second sampling points corresponding to each first sampling point from the plurality of second sampling points;
determining a first position of each first sampling point in a vehicle body coordinate system at a first moment;
determining an observation matrix corresponding to each first sampling point according to a first position corresponding to each first sampling point and second positions of two target second sampling points in a vehicle body coordinate system at the first moment;
and determining a lane line observation matrix according to the observation matrixes respectively corresponding to the plurality of first sampling points.
In a possible implementation manner of the embodiment of the present application, the obtaining module 610 is configured to:
acquiring a second estimated value of a second relative state of the vehicle at a first moment relative to a third moment and a measured value of an inertia measuring unit at the second moment; wherein the third time is earlier than the first time;
and acquiring a first predicted value of the first relative state according to the second estimated value and the measured value of the second relative state.
It should be noted that the explanation of the embodiment of the method for determining vehicle position and orientation data is also applicable to the device for determining vehicle position and orientation data of this embodiment, and therefore is not described herein again.
In the embodiment of the application, a first error value of an error variable of a first relative state of a second moment relative to a first moment is obtained, a first estimation value of the first relative state is determined according to the first error value and a first prediction value of the first relative state, and second position and posture data of a vehicle at the second moment is determined according to the first estimation value and first position and posture data of the vehicle at the first moment. Therefore, the position and posture data of the vehicle at the second moment are determined by the error value of the error variable based on the relative state between different moments, so that the influence of map projection errors is avoided, and the accuracy of the position and posture data is improved.
There is also provided, in accordance with an embodiment of the present application, an electronic device, a readable storage medium, and a computer program product.
FIG. 7 illustrates a schematic block diagram of an example electronic device 700 that can be used to implement embodiments of the present application. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Electronic devices may also represent various forms of mobile devices, such as personal digital processors, cellular telephones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be examples only, and are not meant to limit implementations of the present application that are described and/or claimed herein.
As shown in fig. 7, the device 700 includes a computing unit 701, which can perform various appropriate actions and processes in accordance with a computer program stored in a ROM (Read-Only Memory) 702 or a computer program loaded from a storage unit 708 into a RAM (Random Access Memory) 703. In the RAM 703, various programs and data required for the operation of the device 700 can also be stored. The computing unit 701, the ROM 702, and the RAM 703 are connected to each other by a bus 704. An I/O (Input/Output) interface 705 is also connected to the bus 704.
Various components in the device 700 are connected to the I/O interface 705, including: an input unit 706 such as a keyboard, a mouse, or the like; an output unit 707 such as various types of displays, speakers, and the like; a storage unit 708 such as a magnetic disk, optical disk, or the like; and a communication unit 709 such as a network card, modem, wireless communication transceiver, etc. The communication unit 709 allows the device 700 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
Computing unit 701 may be a variety of general purpose and/or special purpose processing components with processing and computing capabilities. Some examples of the computing Unit 701 include, but are not limited to, a CPU (Central Processing Unit), a GPU (graphics Processing Unit), various dedicated AI (Artificial Intelligence) computing chips, various computing Units running machine learning model algorithms, a DSP (Digital Signal Processor), and any suitable Processor, controller, microcontroller, and the like. The calculation unit 701 executes the respective methods and processes described above, such as the determination method of the vehicle position and orientation data. For example, in some embodiments, the method of determining vehicle position and orientation data may be implemented as a computer software program tangibly embodied in a machine-readable medium, such as storage unit 708. In some embodiments, part or all of a computer program may be loaded onto and/or installed onto device 700 via ROM 702 and/or communications unit 709. When the computer program is loaded into the RAM 703 and executed by the computing unit 701, one or more steps of the above-described determination method of vehicle position and orientation data may be performed. Alternatively, in other embodiments, the computing unit 701 may be configured by any other suitable means (e.g., by means of firmware) to perform the determination method of the vehicle position and orientation data.
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuitry, FPGAs (Field Programmable Gate Arrays), ASICs (Application-Specific Integrated Circuits), ASSPs (Application Specific Standard Products), SOCs (Systems On Chip), CPLDs (Complex Programmable Logic Devices), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special-purpose or general-purpose and which receives data and instructions from, and transmits data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for implementing the methods of the present application may be written in any combination of one or more programming languages. The program code may be provided to a processor or controller of a general-purpose computer, a special-purpose computer, or other programmable data processing apparatus, such that the program code, when executed by the processor or controller, causes the functions/acts specified in the flowcharts and/or block diagrams to be performed. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package partly on the machine and partly on a remote machine, or entirely on the remote machine or server.
In the context of this application, a machine-readable medium may be a tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a RAM, a ROM, an EPROM (Erasable Programmable Read-Only Memory) or flash memory, an optical fiber, a CD-ROM (Compact Disc Read-Only Memory), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (Cathode Ray Tube) or LCD (Liquid Crystal Display) monitor) for displaying information to the user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which the user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a LAN (Local Area Network), a WAN (Wide Area Network), the Internet, and blockchain networks.
The computer system may include clients and servers. A client and a server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, also called a cloud computing server or a cloud host, which is a host product in a cloud computing service system and overcomes the drawbacks of difficult management and weak service scalability found in conventional physical hosts and VPS (Virtual Private Server) services. The server may also be a server of a distributed system, or a server combined with a blockchain.
According to an embodiment of the present application, there is also provided a computer program product which, when its instructions are executed by a processor, performs the method for determining vehicle position and attitude data proposed in the above embodiments of the present application.
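By way of illustration only, the vehicle-speed observation recited in claims 2 to 4 below can be organized as a conventional Kalman-style correction of the error value. In the sketch the state-transition matrix F, process noise Q, measurement noise R, and the linear residual are assumptions, and every identifier is hypothetical rather than taken from the application.

import numpy as np

def speed_observation_update(P_est_t1, second_error, H_speed, v_long_t2, F, Q, R):
    # P_est_t1: covariance estimate between position and attitude at the first time.
    # second_error: error value of the relative state of the first time w.r.t. the third time.
    # H_speed: vehicle speed observation matrix at the second time.
    # v_long_t2: longitudinal speed in the vehicle body frame at the second time.
    # F, Q, R: assumed state-transition, process-noise, and measurement-noise matrices.

    # Predicted covariance at the second time from the estimate at the first time.
    P_pred = F @ P_est_t1 @ F.T + Q
    # Filter gain at the second time from the observation matrix and predicted covariance.
    S = H_speed @ P_pred @ H_speed.T + R
    K = P_pred @ H_speed.T @ np.linalg.inv(S)
    # Residual between the observed longitudinal speed and the speed implied by the
    # current error state (linear form assumed for illustration).
    residual = np.atleast_1d(v_long_t2) - H_speed @ second_error
    # Updated error value, used as the first error value at the second time.
    first_error = second_error + K @ residual
    return first_error, P_pred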
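By way of illustration only, the lane-line observation of claims 7 and 15 below starts from sampling points taken on two lane-line equations, with each sampling point of one line associated with the two nearest sampling points of the other. The cubic polynomial form of the lane-line equations and the Euclidean nearest-neighbour selection in the sketch are assumptions; the per-point observation matrices built from these pairs are not shown.

import numpy as np

def sample_lane(coeffs, x_range, num=20):
    # Sample a lane line given as y = c0 + c1*x + c2*x^2 + c3*x^3 (coeffs lowest order first).
    xs = np.linspace(x_range[0], x_range[1], num)
    ys = np.polyval(coeffs[::-1], xs)
    return np.stack([xs, ys], axis=1)

def associate_points(first_points, second_points):
    # For each first sampling point, pick the two closest second sampling points
    # (the two target points between which the point-to-line residual is measured).
    pairs = []
    for p in first_points:
        d = np.linalg.norm(second_points - p, axis=1)
        i, k = np.argsort(d)[:2]
        pairs.append((p, second_points[i], second_points[k]))
    return pairs

# Example with two hypothetical lane-line polynomials.
first_pts = sample_lane([0.0, 0.02, -0.001, 0.0], (0.0, 30.0))
second_pts = sample_lane([0.1, 0.02, -0.001, 0.0], (0.0, 30.0))
pairs = associate_points(first_pts, second_pts)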
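By way of illustration only, claims 5, 6, 13, and 14 below describe an iterated correction in which the gain and the error value are recomputed until the lane-line observation value falls below a preset threshold. The sketch assumes that a callable supplies the lane-line observation matrix and residual for the current error value and that the residual norm plays the role of the lane-line observation value; none of these choices are prescribed by the application.

import numpy as np

def iterated_lane_update(P_pred, initial_error, lane_observation_fn, R,
                         threshold=1e-3, max_iterations=10):
    # P_pred: predicted covariance at the second time.
    # initial_error: second error value used to start the iteration.
    # lane_observation_fn: callable returning (H_j, residual_j) for an error value.
    # R: assumed measurement-noise matrix.
    error = np.asarray(initial_error, dtype=float)
    for _ in range(max_iterations):
        # Observation matrix and residual for the current (j-th) error value.
        H, residual = lane_observation_fn(error)
        # Filter gain for this update from the predicted covariance and the observation matrix.
        S = H @ P_pred @ H.T + R
        K = P_pred @ H.T @ np.linalg.inv(S)
        # (j+1)-th error value: previous value plus the update error.
        error = error + K @ residual
        # Stop once the lane-line observation value drops below the preset threshold.
        if np.linalg.norm(lane_observation_fn(error)[1]) < threshold:
            break
    return error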
It should be understood that steps may be reordered, added, or deleted using the various forms of flows shown above. For example, the steps described in the present application may be executed in parallel, sequentially, or in a different order, as long as the desired results of the technical solutions disclosed in the present application can be achieved; no limitation is imposed herein.
The above-described specific embodiments are not intended to limit the scope of protection of the present application. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations, and substitutions may be made according to design requirements and other factors. Any modification, equivalent replacement, or improvement made within the spirit and principles of the present application shall be included in the scope of protection of the present application.

Claims (19)

1. A method of determining vehicle position and attitude data, comprising:
acquiring first position and attitude data of a vehicle at a first time, a first predicted value of a first relative state of the vehicle at a second time relative to the first time, and a first error value of an error variable corresponding to the first relative state; wherein the first time is earlier than the second time;
determining a first estimated value of the first relative state according to the first error value and the first predicted value;
and determining second position and attitude data of the vehicle at the second time according to the first position and attitude data and the first estimated value.
2. The method of claim 1, wherein said acquiring the first error value of the error variable corresponding to the first relative state comprises:
acquiring a vehicle speed observation matrix at the second time, an estimated value of a covariance matrix between the position and the attitude of the vehicle at the first time, and a second error value of an error variable corresponding to a second relative state of the vehicle at the first time relative to a third time; wherein the third time is earlier than the first time;
determining a predicted value of the covariance matrix at the second time according to the estimated value of the covariance matrix at the first time;
and updating the second error value according to the vehicle speed observation matrix and the predicted value of the covariance matrix to determine the first error value.
3. The method of claim 2, wherein the updating the second error value according to the vehicle speed observation matrix and the predicted value of the covariance matrix to determine the first error value comprises:
determining a filter gain matrix at the second time according to the vehicle speed observation matrix and the predicted value of the covariance matrix;
and updating the second error value according to the filter gain matrix and a first speed of the vehicle in the longitudinal direction in a vehicle body coordinate system at the second time, so as to determine the first error value.
4. The method of claim 2, wherein said acquiring the vehicle speed observation matrix at the second time comprises:
acquiring a first speed of the vehicle in the longitudinal direction in a vehicle body coordinate system at the second time, and a second speed that is the first speed expressed in the vehicle body coordinate system at the first time;
and acquiring the vehicle speed observation matrix according to the first speed, the second speed, and a rotation matrix from the second speed to the first speed.
5. The method of claim 1, wherein said acquiring the first error value of the error variable corresponding to the first relative state comprises:
acquiring a lane line observation matrix at the second time, an estimated value of a covariance matrix between the position and the attitude of the vehicle at the first time, and a second error value of an error variable corresponding to a second relative state of the vehicle at the first time relative to a third time; wherein the third time is earlier than the first time;
determining a predicted value of the covariance matrix at the second time according to the estimated value of the covariance matrix at the first time;
and updating the second error value according to the lane line observation matrix and the predicted value of the covariance matrix to determine the first error value.
6. The method of claim 5, wherein the updating the second error value according to the lane line observation matrix and the predicted value of the covariance matrix to determine the first error value comprises:
acquiring a lane line observation matrix corresponding to a j-th update and a third error value of the error variable of the first relative state obtained by the j-th update; wherein j is a natural number;
determining a filter gain matrix obtained by the j-th update according to the predicted value of the covariance matrix and the lane line observation matrix corresponding to the j-th update;
determining an update error of a (j+1)-th update relative to the j-th update according to the filter gain matrix obtained by the j-th update, the lane line observation matrix corresponding to the j-th update, and the third error value obtained by the j-th update;
determining a third error value obtained by the (j+1)-th update according to the update error corresponding to the (j+1)-th update and the third error value corresponding to the (j+1)-th update;
determining a lane line observation value corresponding to the (j+1)-th update;
determining the third error value obtained by the (j+1)-th update as the second error value under the condition that the lane line observation value corresponding to the (j+1)-th update is smaller than a preset threshold value;
and under the condition that the lane line observation value corresponding to the (j+1)-th update is greater than or equal to the preset threshold value, performing a (j+2)-th update of the third error value of the error variable of the first relative state according to the predicted value of the covariance matrix and the lane line observation matrix corresponding to the (j+1)-th update, until the lane line observation value is smaller than the preset threshold value, and taking the updated third error value as the second error value.
7. The method of claim 5, wherein said acquiring the lane line observation matrix at the second time comprises:
acquiring a first lane line equation at the second time and a second lane line equation at the first time;
sampling the first lane line equation and the second lane line equation respectively, to obtain a plurality of first sampling points on a first lane line and a plurality of second sampling points on a second lane line;
determining, from the plurality of second sampling points, two target second sampling points corresponding to each first sampling point;
determining a first position of each first sampling point in a vehicle body coordinate system at the first time;
determining an observation matrix corresponding to each first sampling point according to the first position corresponding to the first sampling point and second positions of the two target second sampling points in the vehicle body coordinate system at the first time;
and determining the lane line observation matrix according to the observation matrices respectively corresponding to the plurality of first sampling points.
8. The method according to any one of claims 1-7, wherein said acquiring the first predicted value of the first relative state of the vehicle at the second time relative to the first time comprises:
acquiring a second estimated value of a second relative state of the vehicle at the first time relative to a third time, and a measured value of an inertial measurement unit at the second time; wherein the third time is earlier than the first time;
and acquiring the first predicted value of the first relative state according to the second estimated value of the second relative state and the measured value.
9. A vehicle position and attitude data determination apparatus comprising:
an acquisition module configured to acquire first position and attitude data of a vehicle at a first time, a first predicted value of a first relative state of the vehicle at a second time relative to the first time, and a first error value of an error variable corresponding to the first relative state; wherein the first time is earlier than the second time;
a first determining module configured to determine a first estimated value of the first relative state according to the first error value and the first predicted value;
and a second determining module configured to determine second position and attitude data of the vehicle at the second time according to the first position and attitude data and the first estimated value.
10. The apparatus of claim 9, wherein the acquisition module comprises:
a first obtaining unit, configured to obtain a vehicle speed observation matrix at the second time, an estimated value of a covariance matrix between a position and an attitude of the vehicle at the first time, and a second error value of an error variable corresponding to a second relative state of the vehicle at the first time with respect to a third time; wherein the third time is earlier than the first time;
a first determining unit, configured to determine a predicted value of the covariance matrix at the second time according to the estimated value of the covariance matrix at the first time;
and a first updating unit, configured to update the second error value according to the vehicle speed observation matrix and the predicted value of the covariance matrix, so as to determine the first error value.
11. The apparatus of claim 10, wherein the first updating unit is configured to:
determine a filter gain matrix at the second time according to the vehicle speed observation matrix and the predicted value of the covariance matrix;
and update the second error value according to the filter gain matrix and a first speed of the vehicle in the longitudinal direction in a vehicle body coordinate system at the second time, so as to determine the first error value.
12. The apparatus of claim 10, wherein the first obtaining unit is configured to:
acquire a first speed of the vehicle in the longitudinal direction in a vehicle body coordinate system at the second time, and a second speed that is the first speed expressed in the vehicle body coordinate system at the first time;
and acquire the vehicle speed observation matrix according to the first speed, the second speed, and a rotation matrix from the second speed to the first speed.
13. The apparatus of claim 9, wherein the acquisition module comprises:
a second obtaining unit, configured to obtain a lane line observation matrix at the second time, an estimated value of a covariance matrix between a position and an attitude of the vehicle at the first time, and a second error value of an error variable corresponding to a second relative state of the vehicle at the first time with respect to a third time; wherein the third time is earlier than the first time;
a second determining unit, configured to determine a predicted value of the covariance matrix at the second time according to the estimated value of the covariance matrix at the first time;
and a second updating unit, configured to update the second error value according to the lane line observation matrix and the predicted value of the covariance matrix, so as to determine the first error value.
14. The apparatus of claim 13, wherein the second updating unit is configured to:
acquire a lane line observation matrix corresponding to a j-th update and a third error value of the error variable of the first relative state obtained by the j-th update; wherein j is a natural number;
determine a filter gain matrix obtained by the j-th update according to the predicted value of the covariance matrix and the lane line observation matrix corresponding to the j-th update;
determine an update error of a (j+1)-th update relative to the j-th update according to the filter gain matrix obtained by the j-th update, the lane line observation matrix corresponding to the j-th update, and the third error value obtained by the j-th update;
determine a third error value obtained by the (j+1)-th update according to the update error corresponding to the (j+1)-th update and the third error value corresponding to the (j+1)-th update;
determine a lane line observation value corresponding to the (j+1)-th update;
determine the third error value obtained by the (j+1)-th update as the second error value under the condition that the lane line observation value corresponding to the (j+1)-th update is smaller than a preset threshold value;
and under the condition that the lane line observation value corresponding to the (j+1)-th update is greater than or equal to the preset threshold value, perform a (j+2)-th update of the third error value of the error variable of the first relative state according to the predicted value of the covariance matrix and the lane line observation matrix corresponding to the (j+1)-th update, until the lane line observation value is smaller than the preset threshold value, and take the updated third error value as the second error value.
15. The apparatus of claim 13, wherein the second obtaining unit is configured to:
acquire a first lane line equation at the second time and a second lane line equation at the first time;
sample the first lane line equation and the second lane line equation respectively, to obtain a plurality of first sampling points on a first lane line and a plurality of second sampling points on a second lane line;
determine, from the plurality of second sampling points, two target second sampling points corresponding to each first sampling point;
determine a first position of each first sampling point in a vehicle body coordinate system at the first time;
determine an observation matrix corresponding to each first sampling point according to the first position corresponding to the first sampling point and second positions of the two target second sampling points in the vehicle body coordinate system at the first time;
and determine the lane line observation matrix according to the observation matrices respectively corresponding to the plurality of first sampling points.
16. The apparatus of any one of claims 9-15, wherein the acquisition module is configured to:
acquire a second estimated value of a second relative state of the vehicle at the first time relative to a third time, and a measured value of an inertial measurement unit at the second time; wherein the third time is earlier than the first time;
and acquire the first predicted value of the first relative state according to the second estimated value of the second relative state and the measured value.
17. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-8.
18. A non-transitory computer-readable storage medium having stored thereon computer instructions for causing a computer to perform the method of any one of claims 1-8.
19. A computer program product comprising a computer program which, when executed by a processor, carries out the steps of the method of any one of claims 1 to 8.
CN202210933228.0A 2022-08-04 2022-08-04 Method and device for determining vehicle position and attitude data and electronic equipment Pending CN115307642A (en)

Priority Applications (1)

Application Number: CN202210933228.0A; Priority Date: 2022-08-04; Filing Date: 2022-08-04; Title: Method and device for determining vehicle position and attitude data and electronic equipment

Applications Claiming Priority (1)

Application Number: CN202210933228.0A; Priority Date: 2022-08-04; Filing Date: 2022-08-04; Title: Method and device for determining vehicle position and attitude data and electronic equipment

Publications (1)

Publication Number: CN115307642A; Publication Date: 2022-11-08

Family

Family ID: 83858638

Family Applications (1)

Application Number: CN202210933228.0A; Title: Method and device for determining vehicle position and attitude data and electronic equipment; Priority Date: 2022-08-04; Filing Date: 2022-08-04; Status: Pending

Country Status (1)

Country: CN; Publication: CN115307642A (en)


Legal Events

Date Code Title Description
PB01: Publication
SE01: Entry into force of request for substantive examination