CN114964270A - Fusion positioning method and device, vehicle and storage medium - Google Patents


Info

Publication number
CN114964270A
CN114964270A (application CN202210540746.6A; granted publication CN114964270B)
Authority
CN
China
Prior art keywords
measurement source
fusion positioning
current frame
data
target measurement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210540746.6A
Other languages
Chinese (zh)
Other versions
CN114964270B (en)
Inventor
张丹
朱昊
李金珂
Current Assignee
Uisee Technologies Beijing Co Ltd
Original Assignee
Uisee Technologies Beijing Co Ltd
Priority date
Filing date
Publication date
Application filed by Uisee Technologies Beijing Co Ltd
Priority to CN202210540746.6A
Publication of CN114964270A
Application granted
Publication of CN114964270B
Legal status: Active

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 - Navigation specially adapted for navigation in a road network
    • G01C21/28 - Navigation in a road network with correlation of data from several navigational instruments
    • G01C22/00 - Measuring distance traversed on the ground by vehicles, persons, animals or other moving solid bodies, e.g. using odometers, using pedometers

Abstract

The embodiment of the invention discloses a fusion positioning method and device, a vehicle and a storage medium. The fusion positioning method comprises the following steps: acquiring odometer data and measurement source data of at least one path of measurement source when a fusion positioning event is triggered; screening out target measurement sources with a valid time confidence according to the timestamps in the measurement source data, updating the target measurement source data corresponding to each target measurement source with the odometer data, and generating current frame measurement source data for each path of target measurement source; judging, according to the current frame measurement source data and the previous frame fusion positioning result, whether the target measurement source corresponding to the current frame measurement source data participates in fusion positioning; and for each target measurement source participating in fusion positioning, generating the current frame fusion positioning result with a set filter based on the corresponding current frame measurement source data. By updating the target measurement source data with the odometer data, a more accurate fusion positioning result can be obtained, and by generating the fusion positioning result with the set filter, the robustness of fusion positioning is further improved.

Description

Fusion positioning method and device, vehicle and storage medium
Technical Field
The invention relates to the technical field of automatic driving, in particular to a fusion positioning method, a fusion positioning device, a vehicle and a storage medium.
Background
Fusion positioning combines multiple positioning technologies according to their characteristics and the application requirements, and fuses the various kinds of positioning information through a specific algorithm so that they reinforce one another, improving positioning performance and effect.
In current automatic driving scenarios, positioning information during vehicle travel is acquired by multiple sensors, and the positioning information acquired by these sensors is fused by a Kalman filter to achieve high-precision positioning of the vehicle.
In the process of implementing the invention, the inventors found that in related fusion positioning technologies the timestamps of the positioning information from the multiple sensors differ, so that high-precision positioning information cannot be obtained.
Disclosure of Invention
The invention provides a fusion positioning method, a fusion positioning device, a vehicle and a storage medium, which can improve the precision of a fusion positioning result.
According to an aspect of the present invention, there is provided a fusion localization method, including: acquiring odometer data and measurement source data of at least one path of measurement source when a fusion positioning event is triggered;
screening out target measurement sources with effective time confidence according to timestamps in the measurement source data, updating target measurement source data corresponding to the target measurement sources through the odometer data, and generating current frame measurement source data of each path of target measurement sources;
judging whether a target measurement source corresponding to the current frame measurement source data participates in fusion positioning or not according to the current frame measurement source data and the previous frame fusion positioning result;
and for the target measurement source participating in fusion positioning, generating a current frame fusion positioning result by adopting a set filter based on the corresponding current frame measurement source data.
Optionally, after generating the current frame measurement source data of each path of target measurement source, the method further includes:
clustering the target measurement sources based on the current frame measurement source data of each path of target measurement source, and adding a clustering mark to each path of target measurement source according to a clustering result.
Optionally, the method further comprises:
determining a first measurement source which is successfully clustered in the target measurement sources according to the clustering marks;
acquiring cluster center data of the first measurement source, and determining a first distance difference value and a first course angle difference value between the cluster center data and a current frame first fusion positioning result, wherein the current frame first fusion positioning result comprises the position data and the course angle in the current frame fusion positioning result;
determining a starting timestamp of the fusion positioning deviation according to the first distance difference value and the first course angle difference value;
recording the duration of the fusion positioning deviation based on the starting timestamp, and determining that the fusion positioning state is a deviation state when the duration meets a set condition;
and when the fusion positioning state is a deviation state, determining that the first fusion positioning result of the current frame triggers the filter to reset.
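The deviation check above (first distance difference, first course angle difference, starting timestamp, duration condition) can be sketched as follows. This is a minimal illustration only, not the patent's implementation; the function name, the dictionary-based state, and all threshold values are assumptions.

```python
import math

def update_deviation_state(tracker, cluster_center, fused_pose, timestamp,
                           dist_thresh=0.5, heading_thresh=0.1, min_duration=1.0):
    """Track how long the fused pose deviates from the cluster center.

    cluster_center / fused_pose: (east, north, heading) tuples.
    tracker: dict holding the starting timestamp of the current deviation.
    Returns True once the deviation has persisted long enough that the
    filter reset should be triggered (the "deviation state").
    """
    d = math.hypot(cluster_center[0] - fused_pose[0],
                   cluster_center[1] - fused_pose[1])        # first distance difference
    dtheta = abs(cluster_center[2] - fused_pose[2])          # first course angle difference
    if d > dist_thresh or dtheta > heading_thresh:
        if tracker.get("start") is None:
            tracker["start"] = timestamp                     # record the starting timestamp
        return (timestamp - tracker["start"]) >= min_duration
    tracker["start"] = None                                  # deviation cleared
    return False
```

A caller would invoke this once per fusion frame and, on a True result, reset the set filter from the mean of the first measurement sources as described below.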
Optionally, after the filter reset is triggered by the current frame first fusion positioning result, the method further includes:
and acquiring the average value of the current frame measurement source data corresponding to each first measurement source, and updating the state quantity of the set filter according to the average value so as to realize filter resetting.
Optionally, the method further comprises:
if it is determined that a first measurement source which is successfully clustered does not exist in the target measurement sources according to the clustering marks, determining a second measurement source according to a timestamp corresponding to the target measurement source, and timing by taking the timestamp corresponding to the second measurement source as a timing starting point;
and if the first measurement source which is successfully clustered does not exist within the set timeout time, updating the state quantity of the set filter according to the current frame measurement source data corresponding to the second measurement source so as to realize filter resetting.
Optionally, after the filter is reset, the method further includes:
updating the previous frame fusion positioning result according to the odometer data to obtain a current frame fusion positioning updating result;
and predicting and updating the state quantity of the set filter and the variance of the prediction error according to the current frame fusion positioning updating result and the previous frame fusion positioning result.
Optionally, the generating a current frame fusion positioning result based on the corresponding current frame measurement source data by using the setting filter includes:
acquiring the weight of each path of target measurement source according to the confidence coefficient and the preset weight in the current frame measurement source data participating in the fusion positioning;
acquiring the observed quantity of the set filter according to the weight of each path of target measurement source and the corresponding current frame measurement source data;
and generating a current frame fusion positioning result according to the obtained observed quantity of the set filter and the predicted and updated state quantity.
Optionally, after obtaining the observed quantity of the set filter according to the weight of each target measurement source and the corresponding current frame measurement source data, the method further includes:
and observing and updating the variance of the observation error of the set filter according to the variance of the prediction error after prediction updating and the mean value of the observation error of the set filter.
Optionally, after obtaining the weight of each target measurement source, the method further includes:
acquiring fusion positioning confidence of the current frame fusion positioning result according to the weight and the confidence of each target measurement source participating in fusion;
and determining the fusion positioning state of the current frame fusion positioning result according to the fusion positioning confidence.
Optionally, the method further comprises:
and if no target measurement source participates in the fusion positioning, taking the current frame fusion positioning updating result as the current frame fusion positioning result, and updating the fusion positioning state into a pure motion estimation state.
Optionally, after updating the fused positioning state to the pure motion estimation state, the method further includes:
and acquiring the duration or the duration distance of the pure motion estimation state, and judging whether the filter is reset by the current frame fusion positioning result or not based on the duration or the duration distance.
Optionally, the updating, by the odometer data, of the target measurement source data corresponding to the target measurement source to generate the current frame measurement source data of each path of target measurement source includes:
determining a timestamp difference value according to the current timestamp and the timestamp in the target measurement source data;
and generating a compensation coefficient according to the timestamp difference, the odometer data and the vehicle wheel base, updating target measurement source data corresponding to each target measurement source according to the compensation coefficient, and generating current frame measurement source data of each target measurement source.
Optionally, the determining, according to the result of fusion positioning between the current frame measurement source data and the previous frame, whether a target measurement source corresponding to the current frame measurement source data participates in fusion positioning includes:
determining a second distance difference value and a second course angle difference value of the current frame measurement source data of each path of target measurement source and the previous frame fusion positioning result;
acquiring a first preset group of threshold values and a second preset group of threshold values, wherein the second group of threshold values are larger than the first group of threshold values, and each group of threshold values comprises a distance threshold value and a course threshold value;
for each path of target measurement source, when the second distance difference and the second course angle difference are both smaller than the first group of threshold values, determining that the corresponding target measurement source participates in fusion positioning;
when the second distance difference or the second course angle difference is equal to or larger than the first group of threshold values, comparing the second distance difference and the second course angle difference with the second group of threshold values;
and if the second distance difference and the second heading angle difference are both smaller than the second group of thresholds, determining that the corresponding target measurement source participates in the fusion positioning.
Optionally, after the current frame fusion positioning result is generated based on the corresponding current frame measurement source data by using the setting filter, the method further includes:
determining a third distance difference value and a third course angle difference value between current frame measurement source data of each path of target measurement source and a current frame second fusion positioning result, wherein the current frame second fusion positioning result comprises position data and a course angle in the current frame fusion positioning result;
judging whether each path of target measurement source is in an abnormal state or not according to the third distance difference and the third course angle difference;
if so, sending abnormal state information to the corresponding target measurement source to indicate the corresponding target measurement source to reset.
According to another aspect of the present invention, there is provided a fusion positioning apparatus comprising: the data acquisition module is used for acquiring odometer data and measurement source data of at least one path of measurement source when the fusion positioning event is triggered;
the data updating module is used for screening out target measurement sources with effective time confidence according to the timestamps in the measurement source data, updating the target measurement source data corresponding to the target measurement sources through the odometer data and generating current frame measurement source data of each path of target measurement sources;
the participation fusion judging module is used for judging whether a target measurement source corresponding to the current frame measurement source data participates in fusion positioning or not according to the current frame measurement source data and the previous frame fusion positioning result;
and the fusion positioning module is used for generating a current frame fusion positioning result for the target measurement source participating in fusion positioning based on the corresponding current frame measurement source data by adopting a setting filter.
According to another aspect of the present invention, there is provided a vehicle including: at least one measurement source for providing measurement source data while the vehicle is in motion; an odometer for providing odometer data during vehicle travel;
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores a computer program executable by the at least one processor, the computer program being executable by the at least one processor to enable the at least one processor to perform a method of fusion localization as described in any of the embodiments of the present invention.
According to another aspect of the present invention, there is provided a computer-readable storage medium storing computer instructions for causing a processor to implement a method of fusion localization according to any one of the embodiments of the present invention when executed.
According to the technical scheme of the embodiment of the invention, more accurate fusion positioning can be obtained by updating the target measurement source data by using the odometer data, and whether the target measurement source corresponding to the current frame measurement source data participates in the fusion positioning or not is judged according to the current frame measurement source data and the previous frame fusion positioning result, so that the robustness of the fusion positioning can be improved, and the problem that the related fusion positioning technology in the prior art cannot obtain high-precision positioning information is solved.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present invention, nor do they necessarily limit the scope of the invention. Other features of the present invention will become apparent from the following description.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a flowchart of a fusion positioning method according to an embodiment of the present invention;
FIG. 2 is a flow chart of a fusion positioning method according to an embodiment of the present invention;
FIG. 3 is a flow chart of a fusion positioning method according to an embodiment of the present invention;
FIG. 4 is a flow chart of a fusion positioning method according to an embodiment of the present invention;
FIG. 5 is a schematic structural diagram of a fusion positioning apparatus according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of a vehicle implementing a fusion positioning method according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without making any creative effort based on the embodiments in the present invention, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Fig. 1 is a flowchart of a fusion positioning method according to an embodiment of the present invention, where the embodiment is applicable to an automatic driving situation, the method may be executed by a fusion positioning device, and the fusion positioning device may be implemented in a form of hardware and/or software, and the fusion positioning device may be integrated and configured in a vehicle. As shown in fig. 1, the method includes:
and S110, acquiring the odometer data and the measurement source data of at least one path of measurement source when the fusion positioning event is triggered.
In the embodiment of the invention, fusion positioning is executed after a fusion positioning event is triggered. The trigger condition of the fusion positioning event may vary, and the embodiment of the present invention does not specifically limit it. For example, fusion positioning may be triggered by a timer; or it may be triggered when the time interval since the previous frame's fusion positioning reaches a set time interval; or it may be triggered periodically.
In the embodiment of the invention, a timer is used to trigger the fusion positioning event at a set time interval. Optionally, the trigger time of the fusion positioning event may be taken as the fusion positioning timestamp of the current frame, so that the fusion positioning timestamp coincides with the timer time. Triggering of the fusion positioning event keeps the same time interval between consecutive frames; optionally, this interval may be set as a first time threshold. The parameter value of the first time threshold is not specifically limited in this embodiment and may be configured flexibly according to the actual application scenario. For example, the first time threshold corresponds to the fusion positioning frame rate and may be 10 ms to 50 ms.
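The timer-based triggering just described can be sketched as a simple loop. The loop structure, names, and the 20 ms default (within the 10-50 ms range given above) are illustrative assumptions; a real system would use a hardware or OS timer.

```python
import time

def run_fusion_loop(fuse_fn, interval_s=0.02, frames=3):
    """Trigger a fusion positioning event at a fixed interval.

    `fuse_fn` receives the trigger time, which serves as the current
    frame's fusion positioning timestamp.
    """
    results = []
    next_tick = time.monotonic()
    for _ in range(frames):
        now = time.monotonic()
        if now < next_tick:
            time.sleep(next_tick - now)          # wait out the remainder of the interval
        stamp = time.monotonic()                 # fusion positioning timestamp = trigger time
        results.append(fuse_fn(stamp))
        next_tick += interval_s                  # keep a constant interval between frames
    return results
```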
In the embodiment of the invention, in the mode of triggering the fusion positioning by using the time interval with the previous frame, the ending moment of the fusion positioning of the previous frame is used as the starting point of timing, and the fusion positioning is triggered when the timing time meets the set time interval.
In the method for periodically triggering fusion positioning in the embodiment of the invention, fusion positioning is periodically triggered according to a preset trigger period value.
The odometer data may be data measured or calculated by the odometer; the odometer data acquired in this application may include a timestamp, a vehicle speed, front wheel steering angle data, and the like.
The measurement source data may be data measured or calculated by a measurement source, and may include a timestamp, an east position, a north position, an elevation, a pitch angle, a roll angle, a heading angle, a confidence coefficient, and the like. The measurement source may be any available positioning device in the vehicle, such as GPS (Global Positioning System), visual SLAM (Simultaneous Localization and Mapping), laser SLAM positioning, visual semantic positioning, or laser semantic positioning, as long as it can provide a positioning function for the vehicle; the embodiment of the present invention does not limit the type or number of measurement sources.
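The odometer and measurement source records listed above might be represented as plain data containers. The class and field names below are illustrative assumptions, not definitions from the patent.

```python
from dataclasses import dataclass

@dataclass
class OdometerData:
    """Odometer record: timestamp, vehicle speed, front wheel steering angle."""
    timestamp: float          # t_i
    speed: float              # vehicle speed V_i
    front_wheel_angle: float  # front wheel steering angle S_i (rad)

@dataclass
class MeasurementSourceData:
    """One frame of data from a single measurement source (GPS, SLAM, ...)."""
    timestamp: float   # t_m
    east: float        # east position x_m
    north: float       # north position y_m
    elevation: float   # z_m
    pitch: float       # pitch angle (rad)
    roll: float        # roll angle (rad)
    heading: float     # heading angle theta_m (rad)
    confidence: float  # confidence coefficient, e.g. in [0, 1]
```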
S120, screening out target measurement sources with effective time confidence according to the time stamps in the measurement source data, updating the target measurement source data corresponding to the target measurement sources through the odometer data, and generating current frame measurement source data of each path of target measurement sources.
The time confidence is used to judge whether a measurement source can participate in fusion positioning; only a measurement source with a valid time confidence can participate. The timestamp of each measurement source can be compared with the fusion positioning timestamp, and the time confidence state of the corresponding measurement source is determined according to the comparison result. Specifically, if the difference between the timestamp of a measurement source and the fusion positioning timestamp of the current frame is smaller than a second time threshold, and the confidence of the measurement source is greater than a first confidence threshold, the time confidence of that measurement source is determined to be valid for the fusion positioning of the current frame; otherwise, it is determined to be invalid. It should be noted that in this embodiment both the second time threshold and the first confidence threshold may be set as needed. For example, the second time threshold is a threshold for judging whether the measurement source is valid, and its value may be 20 ms to 1000 ms. The first confidence threshold is likewise a threshold for judging whether the measurement source is valid, and its value may be 0.2 to 0.4.
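The validity test just described reduces to two comparisons. A minimal sketch, with default thresholds chosen from the ranges above and a function name that is an assumption:

```python
def time_confidence_valid(source_stamp, fusion_stamp, confidence,
                          time_thresh=0.5, conf_thresh=0.3):
    """Return True if the measurement source may join the current frame's fusion.

    time_thresh: second time threshold in seconds (20 ms - 1000 ms range above).
    conf_thresh: first confidence threshold (0.2 - 0.4 range above).
    """
    return abs(fusion_stamp - source_stamp) < time_thresh and confidence > conf_thresh
```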
In the embodiment of the invention, a measurement source with a valid time confidence is set as a target measurement source. The timestamp of a target measurement source may still not coincide with the fusion positioning timestamp. Since the odometer and the measurement sources are all devices in the vehicle, the odometer data can be used to position the vehicle, and the measurement source data directly provides positioning data of the vehicle. The odometer data can therefore be used to compensate the target measurement source data of each target measurement source with a valid time confidence, aligning that data to the fusion positioning timestamp.
According to the embodiment of the invention, the target measurement source data is updated by using the odometer data, so that more accurate fusion positioning data can be obtained.
For example, updating target measurement source data corresponding to the target measurement source by the odometry data, and generating current frame measurement source data of each path of target measurement source may include: determining a timestamp difference value according to the current timestamp and a timestamp in the target measurement source data; and generating a compensation coefficient according to the timestamp difference, the odometer data and the vehicle wheel base, updating target measurement source data corresponding to each target measurement source according to the compensation coefficient, and generating current frame measurement source data of each target measurement source.
And calculating the time stamp difference between the current time stamp and the time stamp of the target measurement source with the effective time confidence.
The compensation coefficient is used to time-compensate the target measurement source data of the current frame to obtain the current frame measurement source data. The compensation coefficients include a first compensation coefficient D, a second compensation coefficient R, a third compensation coefficient C_x and a fourth compensation coefficient C_y. The first compensation coefficient D may be generated based on the current frame odometer vehicle speed and the timestamp difference. The second compensation coefficient R may be generated based on the vehicle wheel base and the current frame odometer front wheel steering angle. The third compensation coefficient C_x may be generated based on the east position included in the target measurement source data of the current frame, the second compensation coefficient R, and the heading angle included in the target measurement source data of the current frame. The fourth compensation coefficient C_y may be generated based on the north position included in the target measurement source data of the current frame, the second compensation coefficient R, and the heading angle included in the target measurement source data of the current frame.
In one case, updating the target measurement source data corresponding to each path of target measurement source according to the compensation coefficients, and generating the current frame measurement source data of each path of target measurement source, may specifically include: updating the east position x_m, north position y_m, elevation z_m, pitch angle ψ_m, roll angle φ_m and heading angle θ_m corresponding to each path of measurement source according to the first compensation coefficient D, the second compensation coefficient R, the third compensation coefficient C_x and the fourth compensation coefficient C_y, and generating the current frame measurement source data of each path of target measurement source compensated by the odometer data.
Optionally, in the embodiment of the present invention, the current frame measurement source data is generated by the following formulas:

Δt = t_i - t_m;  (1.1)
D = V_i * Δt;  (1.2)
R = L / tan S_i;  (1.3)
C_x = x_m - R * sin θ_m;  (1.4)
C_y = y_m + R * cos θ_m;  (1.5)
θ'_m = θ_m + D / R;  (1.6)
x'_m = C_x + D * cos θ'_m;  (1.7)
y'_m = C_y + D * sin θ'_m;  (1.8)
z'_m = z_m;  (1.9)
ψ'_m = ψ_m;  (1.10)
φ'_m = φ_m;  (1.11)
X_m = (x_m, y_m, z_m, ψ_m, φ_m, θ_m);  (1.12)
X'_m = (x'_m, y'_m, z'_m, ψ'_m, φ'_m, θ'_m);  (1.13)

where t_i is the current timestamp; t_m is the timestamp of the target measurement source data; V_i is the current frame odometer vehicle speed; S_i is the current frame odometer front wheel steering angle; L is the vehicle wheel base; X_m is the current frame measurement source data before updating, comprising the east position x_m, north position y_m, elevation z_m, pitch angle ψ_m, roll angle φ_m and heading angle θ_m of the target measurement source; and X'_m is the updated current frame measurement source data, comprising the east position x'_m, north position y'_m, elevation z'_m, pitch angle ψ'_m, roll angle φ'_m and heading angle θ'_m of the target measurement source.
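The compensation of formulas (1.1)-(1.11) can be implemented directly. The sketch below follows the patent's equations as printed; the function and parameter names are assumptions, and the straight-line case S_i = 0 (where R diverges) is not handled.

```python
import math

def compensate_measurement(t_i, t_m, V_i, S_i, L,
                           x_m, y_m, z_m, pitch_m, roll_m, theta_m):
    """Time-compensate one target measurement source to the current timestamp.

    Assumes S_i != 0 (non-zero front wheel steering angle).
    Returns the updated (east, north, elevation, pitch, roll, heading).
    """
    dt = t_i - t_m                          # (1.1) timestamp difference
    D = V_i * dt                            # (1.2) first compensation coefficient
    R = L / math.tan(S_i)                   # (1.3) second compensation coefficient
    C_x = x_m - R * math.sin(theta_m)       # (1.4) third compensation coefficient
    C_y = y_m + R * math.cos(theta_m)       # (1.5) fourth compensation coefficient
    theta_new = theta_m + D / R             # (1.6) updated heading angle
    x_new = C_x + D * math.cos(theta_new)   # (1.7) updated east position
    y_new = C_y + D * math.sin(theta_new)   # (1.8) updated north position
    # (1.9)-(1.11): elevation, pitch and roll are carried over unchanged
    return x_new, y_new, z_m, pitch_m, roll_m, theta_new
```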
S130, judging whether the target measurement source corresponding to the current frame measurement source data participates in fusion positioning or not according to the current frame measurement source data and the previous frame fusion positioning result.
In the embodiment of the invention, the fusion positioning is triggered periodically, and the result of the previous frame of fusion positioning is positioning information obtained by fusing the previous frame of measurement source data before the current frame of measurement source data is fused and positioned.
It should be noted that the target measurement source with the valid time confidence may not participate in the fusion positioning for some reasons, and it needs to determine whether the target measurement source corresponding to the current frame of measurement source data may participate in the fusion positioning based on the east position, the north position, and the heading angle in the current frame of measurement source data and the corresponding parameters in the previous frame of fusion positioning result.
Illustratively, S130 may include: determining a second distance difference and a second heading angle difference between the current frame measurement source data of each target measurement source and the previous frame fusion positioning result; acquiring a preset first group of thresholds and a preset second group of thresholds, wherein the second group of thresholds is larger than the first group of thresholds, and each group of thresholds includes a distance threshold and a heading threshold; for each target measurement source, determining that the corresponding target measurement source participates in fusion positioning when the second distance difference and the second heading angle difference are both smaller than the first group of thresholds; when the second distance difference or the second heading angle difference is equal to or larger than the first group of thresholds, comparing the second distance difference and the second heading angle difference with the second group of thresholds; and if the second distance difference and the second heading angle difference are both smaller than the second group of thresholds, determining that the corresponding target measurement source participates in fusion positioning.
In the embodiment of the present invention, the distance difference d_m and the heading angle difference Δθ_m between the current frame measurement source data and the previous frame fusion positioning result need to be calculated, by the following formulas:

d_m = √((x_m − x)² + (y_m − y)²); (1.14)

Δθ_m = |θ_m − θ|; (1.15)

wherein x_m, y_m, θ_m are respectively the east position, north position and heading angle included in the current frame measurement source data, and x, y, θ are respectively the east position, north position and heading angle included in the previous frame fusion positioning result.
In the embodiment of the present invention, the distance difference d_m and the heading angle difference Δθ_m are used to judge whether the target measurement source corresponding to the current frame measurement source data participates in fusion positioning. Specifically, when d_m is smaller than the set first distance threshold and Δθ_m is smaller than the set first heading threshold, the target measurement source is determined to participate in fusion positioning; otherwise, it is determined not to participate. If all target measurement sources are judged not to participate in fusion positioning, the following judgment is continued. The first distance threshold is the first-tier distance threshold for judging whether a measurement source with valid time confidence enters the fusion, and may take a value of 0.2 m to 0.5 m. The first heading threshold is the first-tier heading threshold for judging whether a measurement source with valid time confidence enters the fusion, and may take a value of 0.5 deg to 1.0 deg.
When d is m Less than setSecond distance threshold value and delta theta m And when the target measurement source is smaller than the set second course threshold value, determining that the target measurement source participates in the fusion positioning, otherwise, determining that the target measurement source does not participate in the fusion positioning. The second distance threshold is a second heavy distance threshold for judging that the time confidence is effective and the measurement source enters the fusion, and the value can be 0.5 m-1.0 m. The second course threshold is a second re-course threshold for judging that the time confidence coefficient is effective and the measurement source enters the fusion, and the value can be 1.0 deg-2.0 deg.
The embodiment of the present invention provides two groups of thresholds: the first group includes the first distance threshold and the first heading threshold, and the second group includes the second distance threshold and the second heading threshold, wherein the second distance threshold is larger than the first distance threshold and the second heading threshold is larger than the first heading threshold. Judging whether each target measurement source participates in fusion positioning through two groups of thresholds with different values allows more measurement sources to participate in fusion positioning, thereby improving the robustness of fusion positioning.
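As a non-limiting illustration, the two-tier threshold judgment of S130 may be sketched as follows; the concrete threshold values are example values chosen from the ranges given above, and the dictionary keys are assumed names:

```python
import math

# Example values chosen from the ranges stated in the text (assumptions).
FIRST_DIST_THRESH = 0.3      # m,   first distance threshold (0.2 m - 0.5 m)
FIRST_HEADING_THRESH = 0.8   # deg, first heading threshold (0.5 deg - 1.0 deg)
SECOND_DIST_THRESH = 0.8     # m,   second distance threshold (0.5 m - 1.0 m)
SECOND_HEADING_THRESH = 1.5  # deg, second heading threshold (1.0 deg - 2.0 deg)

def participates(source, fused):
    """Decide whether a target measurement source joins fusion positioning.

    source/fused hold east position 'x', north position 'y' and heading
    angle 'theta' (deg); d_m and dtheta_m follow formulas (1.14)-(1.15).
    """
    d_m = math.hypot(source["x"] - fused["x"], source["y"] - fused["y"])
    dtheta_m = abs(source["theta"] - fused["theta"])
    # First group of thresholds.
    if d_m < FIRST_DIST_THRESH and dtheta_m < FIRST_HEADING_THRESH:
        return True
    # Fall back to the looser second group of thresholds.
    return d_m < SECOND_DIST_THRESH and dtheta_m < SECOND_HEADING_THRESH
```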
And S140, generating a current frame fusion positioning result based on the corresponding current frame measurement source data by adopting a set filter for the target measurement source participating in fusion positioning.
The set filter may be any filter capable of fusing data of multiple measurement sources to obtain positioning information; the selection of the filter is not particularly limited in the embodiment of the present invention.
And the current frame fusion positioning result is positioning information generated after the current frame measurement source data is fused.
In the embodiment of the present invention, after the fusion positioning event is triggered, fusion positioning data and measurement source state data may be output. The fusion positioning data may include: a timestamp, an east position, a north position, an elevation, a pitch angle, a roll angle, a heading angle, a confidence, fusion positioning state data and the like, wherein the fusion positioning state data may include a not-successfully-reset state, a high-precision positioning state, a low-precision positioning state, a pure motion estimation state, a biased state and the like; the measurement source state data may include: time confidence valid/invalid, participating/not participating in fusion, abnormal/normal, and the like.
In the embodiment of the present invention, a kalman filter may be used to generate the current frame fusion positioning result, specifically, the system equation of the kalman filter is as follows:
X_t = X_{t-1} + u_t + w_{t-1}; (1.16)

Z_t = X_t + v_t; (1.17)

X = (x, y, z, ψ, φ, θ); (1.18)

Z = (x_m, y_m, z_m, ψ_m, φ_m, θ_m); (1.19)

wherein the state quantity X of the filter is the fusion positioning result, including the east position x, north position y, elevation z, pitch angle ψ, roll angle φ and heading angle θ; the observed quantity Z of the filter is the current frame measurement source data participating in fusion positioning, including the east position x_m, north position y_m, elevation z_m, pitch angle ψ_m, roll angle φ_m and heading angle θ_m; X_t is the fusion positioning result of the Kalman filter corresponding to the current frame; Z_t is the observed quantity of the Kalman filter corresponding to the current frame; X_{t-1} is the previous frame fusion positioning result; u_t is the increment of the motion update of the target measurement source of the current frame based on the odometer data; w_{t-1} is the filter prediction error of the previous frame fusion positioning; v_t is the filter observation error of the current frame fusion positioning.
Optionally, other types of filters may also be used in the embodiments of the present invention, and when other types of filters are used, the system equation of the filter may change.
In the embodiment of the present invention, in the initialization step of the filter, the means of the prediction error w and the observation error v are set as the parameter matrices Q and R respectively, and the variances of the prediction error w and the observation error v are set as P⁻ and P, both initialized to 0.
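As a non-limiting illustration, the system equations (1.16)-(1.17) with the initialization described above may be sketched as a per-component (diagonal) filter over the six-dimensional state; the noise values q and r are example parameters, not taken from the text:

```python
import numpy as np

class FusionKalman:
    """Minimal per-component sketch of the filter of equations (1.16)-(1.17)
    over the state (x, y, z, pitch, roll, heading); diagonal noise assumed."""

    def __init__(self, q=0.01, r=0.04, dim=6):
        self.Q = np.full(dim, q)   # prediction-error parameter, per component
        self.R = np.full(dim, r)   # observation-error parameter, per component
        self.P = np.zeros(dim)     # variance, initialized to 0 as in the text
        self.X = np.zeros(dim)     # state quantity (fusion positioning result)

    def predict(self, u):
        # X_t = X_{t-1} + u_t: motion increment from odometry, eq. (1.16)
        self.X = self.X + u
        self.P = self.P + self.Q

    def update(self, z):
        # Z_t = X_t + v_t: identity observation model, eq. (1.17)
        K = self.P / (self.P + self.R)
        self.X = self.X + K * (z - self.X)
        self.P = (1.0 - K) * self.P
```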
And under the condition that the fusion positioning state of the current frame meets the set condition, resetting the filter before performing fusion positioning on the next frame of measurement source data corresponding to each path of measurement source participating in the fusion positioning.
In the embodiment of the present invention, more accurate fusion positioning can be obtained by updating the target measurement source data with the odometer data, and whether the target measurement source corresponding to the current frame measurement source data participates in fusion positioning can be judged from the current frame measurement source data and the previous frame fusion positioning result, so that the robustness of fusion positioning is improved.
Fig. 2 is a flowchart of another fusion positioning method according to an embodiment of the present invention, where after current frame measurement source data of each path of target measurement source is generated in the step of the embodiment of the present invention, the method further includes clustering the target measurement sources, attaching a cluster marker to each path of target measurement source according to a clustering result, determining a fusion positioning state based on a cluster center, and triggering a filter to reset when the fusion positioning state is a biased state. As shown in fig. 2, the method includes:
s210, acquiring odometer data and measurement source data of at least one path of measurement source when the fusion positioning event is triggered.
S220, screening out target measurement sources with effective time confidence according to the time stamps in the measurement source data, updating the target measurement source data corresponding to the target measurement sources through the odometer data, and generating current frame measurement source data of each path of target measurement sources.
And S230, clustering the target measurement sources based on the current frame measurement source data of each path of target measurement source, and adding a clustering mark to each path of target measurement source according to a clustering result.
It should be noted that, some or all data in the current frame measurement source data of each path of target measurement source may be used to cluster the target measurement sources, and specifically, which data is used for clustering may be freely set according to the actual application scenario, which is not specifically limited in the embodiment of the present invention.
Optionally, in the embodiment of the present invention, the target measurement sources are clustered by using the east position, the north position and the heading angle in the current frame measurement source data of each target measurement source.
In the embodiment of the present invention, the distance difference d_ij and the heading difference Δθ_ij between target measurement sources can be calculated by the following formulas:

d_ij = √((x_i − x_j)² + (y_i − y_j)²); (2.1)

Δθ_ij = |θ_i − θ_j|; (2.2)

wherein x_i, y_i, θ_i are respectively the east position, north position and heading angle included in the current frame measurement source data corresponding to measurement source i, and x_j, y_j, θ_j are respectively the east position, north position and heading angle included in the current frame measurement source data corresponding to measurement source j.
It should be noted that measurement source i and measurement source j are any two measurement sources among the target measurement sources, including measurement sources participating in fusion positioning and measurement sources not participating in fusion positioning. In this embodiment, whether the target measurement sources are successfully clustered can be judged from the values of d_ij and Δθ_ij. For example, when d_ij is smaller than the set third distance threshold and Δθ_ij is smaller than the set third heading threshold, measurement source i and measurement source j are determined to be successfully clustered; otherwise, they are determined not to be successfully clustered. The third distance threshold is the clustering threshold of the measurement sources based on distance data, with a value of 0.5 m to 0.8 m. The third heading threshold is the clustering threshold of the measurement sources based on heading data, with a value of 0.5 deg to 1.5 deg.
Cluster analysis is performed between every two target measurement sources, and a valid cluster mark is given to each successfully clustered measurement source. The valid cluster mark may be embodied in digital form, such as 1, 2, …, which is not specifically limited in this embodiment; the cluster marks of successfully clustered target measurement sources are consistent. In addition, a target measurement source that is not successfully clustered is given an invalid cluster mark, which may also be embodied in digital form, for example -1, which is likewise not specifically limited in the embodiment of the present invention.
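As a non-limiting illustration, the pairwise clustering with digital marks may be sketched as follows; the threshold values are example values from the stated ranges, and merging of two already-marked clusters is deliberately omitted in this simple sketch:

```python
import math

THIRD_DIST_THRESH = 0.6     # m,   example value from 0.5 m - 0.8 m
THIRD_HEADING_THRESH = 1.0  # deg, example value from 0.5 deg - 1.5 deg

def cluster_marks(sources):
    """Assign cluster marks per formulas (2.1)-(2.2): pairwise-close sources
    share a positive mark; a source that clusters with nobody keeps mark -1."""
    n = len(sources)
    marks = [-1] * n
    next_mark = 1
    for i in range(n):
        for j in range(i + 1, n):
            d_ij = math.hypot(sources[i]["x"] - sources[j]["x"],
                              sources[i]["y"] - sources[j]["y"])
            dtheta_ij = abs(sources[i]["theta"] - sources[j]["theta"])
            if d_ij < THIRD_DIST_THRESH and dtheta_ij < THIRD_HEADING_THRESH:
                if marks[i] == -1 and marks[j] == -1:
                    marks[i] = marks[j] = next_mark   # open a new cluster
                    next_mark += 1
                elif marks[i] == -1:
                    marks[i] = marks[j]               # join j's cluster
                elif marks[j] == -1:
                    marks[j] = marks[i]               # join i's cluster
    return marks
```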
S240, judging whether the target measurement source corresponding to the current frame measurement source data participates in the fusion positioning or not according to the current frame measurement source data and the previous frame fusion positioning result.
And S250, generating a current frame fusion positioning result based on the corresponding current frame measurement source data by adopting a set filter for the target measurement source participating in fusion positioning.
And S260, determining a first measurement source which is successfully clustered in the target measurement sources according to the clustering marks.
Illustratively, the first measurement source of the target measurement sources that is successfully clustered may be determined from the digital form of the cluster marker.
S270, cluster center data of the first measurement source are obtained, and a first distance difference value and a first course difference value of a first fusion positioning result of the cluster center data and the current frame are determined.
And the first fusion positioning result of the current frame comprises position data and a course angle in the fusion positioning result of the current frame.
And the clustering center data is all or part of data of the clustering center of the first measurement source which is successfully clustered. There are many ways to obtain the cluster center, and the embodiment of the present invention is not limited in particular. For example, the geometric center of the current frame measurement source data corresponding to the first measurement source may be determined as the cluster center. Alternatively, a cluster analysis K-Means algorithm is used to determine the cluster center of the first measurement source. And determining whether all data of the cluster center is selected as cluster center data or partial data of the cluster center is selected as cluster center data according to data included in the first fusion positioning result of the current frame.
The east position, north position and heading angle included in the current frame fusion positioning result are obtained to form the first fusion positioning result of the current frame. Correspondingly, the east position, north position and heading angle of the cluster center are obtained to form the cluster center data. The first distance difference between the cluster center data and the first fusion positioning result of the current frame is calculated from the east positions and north positions, and the first heading difference is calculated from the heading angles. Specifically, in the embodiment of the present invention, the first distance difference d_ci and the first heading difference Δθ_ci between the cluster center data and the first fusion positioning result of the current frame are calculated by the following formulas:

d_ci = √((x_c − x_i)² + (y_c − y_i)²); (2.3)

Δθ_ci = |θ_c − θ_i|; (2.4)

wherein x_c, y_c, θ_c are respectively the east position, north position and heading angle included in the cluster center data, and x_i, y_i, θ_i are respectively the east position, north position and heading angle included in the first fusion positioning result of the current frame.
And S280, determining a starting timestamp of the fusion positioning deviation according to the first distance difference and the first heading angle difference.
The start timestamp of the fusion positioning deviation refers to a time when the fusion positioning deviation starts to meet a set condition.
For example, when the first distance difference d_ci is greater than the set fourth distance threshold and the first heading difference Δθ_ci is greater than the set fourth heading threshold, the start timestamp t_cs of the fusion positioning deviation is recorded. The fourth distance threshold is the fusion positioning deviation distance threshold, with a value of 0.8 m to 1.0 m. The fourth heading threshold is the fusion positioning deviation heading threshold, with a value of 0.5 deg to 1.5 deg.
And S290, recording the duration of the fusion positioning deviation based on the starting timestamp, and determining that the fusion positioning state is a deviation state when the duration meets the set condition.
The set condition is a condition for determining whether the fusion positioning state is a deviation state. A third time threshold may be set, and if the duration of the fused positioning deviation is greater than the third time threshold, the duration is determined to satisfy the set condition. And the third time threshold is a fusion positioning deviation time threshold, and the value can be 1 s-2 s.
Exemplarily, if the first distance difference d_ci continues to be greater than the set fourth distance threshold and the first heading difference Δθ_ci continues to be greater than the set fourth heading threshold, it is determined that the fusion positioning deviation occurs continuously. The time during which the fusion positioning deviation occurs continuously is recorded to obtain the duration of the fusion positioning deviation, with the start timestamp as the timing starting point.
Specifically, in the embodiment of the present invention, the duration Δ T is calculated by using the following formula:
ΔT = t_i − t_cs; (2.6)

wherein t_i is the fusion positioning timestamp of the current frame.
When the duration ΔT is greater than the third time threshold, the fusion positioning state is determined to be a biased state.
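As a non-limiting illustration, the deviation timing of formulas (2.3)-(2.6) may be sketched as a small monitor class; the threshold values are example values from the stated ranges:

```python
THIRD_TIME_THRESH = 1.5  # s, example value from the 1 s - 2 s range

class DeviationMonitor:
    """Tracks how long the fused pose deviates from the cluster center;
    biased() returns True once the deviation has lasted longer than the
    third time threshold, per formula (2.6)."""

    def __init__(self, dist_thresh=0.9, heading_thresh=1.0):
        self.dist_thresh = dist_thresh        # fourth distance threshold (0.8-1.0 m)
        self.heading_thresh = heading_thresh  # fourth heading threshold (0.5-1.5 deg)
        self.t_cs = None                      # start timestamp of the deviation

    def biased(self, t_i, d_ci, dtheta_ci):
        if d_ci > self.dist_thresh and dtheta_ci > self.heading_thresh:
            if self.t_cs is None:
                self.t_cs = t_i               # record the start timestamp t_cs
            return (t_i - self.t_cs) > THIRD_TIME_THRESH   # duration ΔT, (2.6)
        self.t_cs = None                      # deviation interrupted: restart timing
        return False
```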
And S2100, when the fusion positioning state is a deviation state, determining that the first fusion positioning result of the current frame triggers the filter to reset.
Illustratively, if the fusion positioning state is a biased state, it is determined that the first fusion positioning result of the current frame triggers a filter reset; the filter reset is implemented based on the target measurement source data of the next frame, and the next frame measurement source data is then fused through the successfully reset filter. In one case, the average value of the current frame measurement source data corresponding to each first measurement source is obtained, and the state quantity of the set filter is updated according to the average value, so that the filter is reset. It should be noted that this current frame measurement source data is actually the target measurement source data of the frame following the fusion positioning result that triggered the filter reset. Since fusion positioning is performed in a loop, the measurement source data of any current frame may be used for a filter reset caused by the previous frame fusion positioning result. For convenience of the following description, the current frame measurement source data is used herein to denote the measurement source data used for the filter reset after the filter reset is triggered.
Illustratively, the first measurement sources with the same cluster mark, representing successful clustering, are obtained, and the average value of the current frame measurement source data corresponding to each first measurement source is calculated by the following formulas:

X̄ = (1/C) · Σ_{k=1}^{C} X_k; (2.7)

X̄ = (x̄, ȳ, z̄, ψ̄, φ̄, θ̄); (2.8)

wherein X_k is the current frame measurement source data of the k-th first measurement source, i.e. a target measurement source whose cluster mark is not -1; k is the identity of the first measurement source and may take the values k = 1, 2, 3, …, C; C is the number of first measurement sources; x̄, ȳ, z̄, ψ̄, φ̄, θ̄ are respectively the average east position, average north position, average elevation, average pitch angle, average roll angle and average heading angle of the first measurement sources.

The state quantity of the current filter is updated to X̄, and the resetting of the filter is completed.
In another case: if it is determined that the first measurement source which is successfully clustered does not exist in the target measurement sources according to the clustering marks, determining a second measurement source according to a time stamp corresponding to the target measurement source, and timing by taking the time stamp corresponding to the second measurement source as a timing starting point; and if the first measurement source which is successfully clustered does not exist within the set timeout time, updating the state quantity of the set filter according to the current frame measurement source data corresponding to the second measurement source so as to realize filter resetting.
In the embodiment of the present invention, if the cluster marks of all target measurement sources are -1, indicating that there is no successfully clustered first measurement source, timing starts from the appearance of the first measurement source with valid time confidence (i.e., the second measurement source). When the timer reaches the fourth time threshold and no successfully clustered first measurement source has appeared in this period, the state quantity of the current filter is updated to the current frame measurement source data corresponding to the second measurement source, which also implements the resetting of the filter. The fourth time threshold is the single-positioning-source fusion reset time threshold, with a value of 10 s to 20 s.
In the embodiment of the present invention, if the filter is not successfully reset by the average value of the current frame measurement source data corresponding to the first measurement sources or by the current frame measurement source data corresponding to the second measurement source, the fusion positioning state is determined to be a fusion-positioning-invalid-reset state. When fusion positioning is triggered, the fusion-positioning-invalid-reset state is output, and the system continues to wait: once clustering succeeds, the reset of the filter is completed with the average value of the current frame measurement source data corresponding to the first measurement sources; otherwise, when clustering has not yet succeeded but a first second measurement source appears, the reset of the filter is completed with the current frame measurement source data corresponding to the second measurement source.
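As a non-limiting illustration, the cluster-mean reset of formula (2.7) may be sketched as follows; when no successfully clustered source exists the function returns None, and the caller falls back to the single second measurement source after the timeout described above:

```python
import numpy as np

def reset_state(sources, marks):
    """Return the state used to reset the filter: the mean of the current
    frame data of the successfully clustered first measurement sources
    (cluster mark != -1), per formula (2.7); None when no cluster exists."""
    data = [np.asarray(s, dtype=float)
            for s, mark in zip(sources, marks) if mark != -1]
    if not data:
        return None  # caller falls back to the second-measurement-source path
    # Average east, north, elevation, pitch, roll and heading, eq. (2.7)
    return np.mean(data, axis=0)
```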
And S2110, updating the previous frame fusion positioning result according to the odometer data to obtain a current frame fusion positioning updating result.
In the embodiment of the invention, firstly, the odometer data is used for compensating the fusion positioning result of the previous frame, and the fusion positioning result of the previous frame is aligned to the fusion positioning timestamp of the current frame; specifically, the fusion localization update can be performed by the following formula:
Δt = t_i − t_{i-1}; (2.9)

D = V_{i-1} · Δt; (2.10)

R = L / tan S_{i-1}; (2.11)

C_x = x_{i-1} − R · sin θ_{i-1}; (2.12)

C_y = y_{i-1} + R · cos θ_{i-1}; (2.13)

θ'_i = θ_{i-1} + D/R; (2.14)

x'_i = C_x + D · cos θ_i; (2.15)

y'_i = C_y + D · sin θ_i; (2.16)

z'_i = z_{i-1}; (2.17)

ψ'_i = ψ_{i-1}; (2.18)

φ'_i = φ_{i-1}; (2.19)

X_{i-1} = (x_{i-1}, y_{i-1}, z_{i-1}, ψ_{i-1}, φ_{i-1}, θ_{i-1}); (2.20)

X'_i = (x'_i, y'_i, z'_i, ψ'_i, φ'_i, θ'_i); (2.21)
wherein t_i is the fusion positioning timestamp of the current frame; t_{i-1} is the fusion positioning timestamp of the previous frame; V_{i-1} is the odometer speed of the previous frame; S_{i-1} is the odometer front-wheel steering angle of the previous frame; L is the vehicle wheelbase; X_{i-1} is the previous frame fusion positioning result, including the east position, north position, elevation, pitch angle, roll angle and heading angle; X'_i is the current frame fusion positioning result after the fusion positioning update, including the east position, north position, elevation, pitch angle, roll angle and heading angle.
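As a non-limiting illustration, the odometry compensation of formulas (2.9)-(2.16) may be transcribed as follows; elevation, pitch and roll simply carry over per (2.17)-(2.19). Angles are in radians, and the guard for a near-zero steering angle (where R would be infinite) is an added assumption, not part of the text:

```python
import math

def fusion_update(x, y, theta, t_i, t_prev, v_prev, s_prev, wheelbase):
    """Advance the previous frame fused pose (x, y, theta) to the current
    fusion positioning timestamp, transcribing formulas (2.9)-(2.16)."""
    dt = t_i - t_prev                         # (2.9)
    D = v_prev * dt                           # (2.10) distance travelled
    if abs(math.tan(s_prev)) < 1e-9:          # added guard: straight-line motion
        return x + D * math.cos(theta), y + D * math.sin(theta), theta
    R = wheelbase / math.tan(s_prev)          # (2.11) turning radius
    c_x = x - R * math.sin(theta)             # (2.12) circle center, east
    c_y = y + R * math.cos(theta)             # (2.13) circle center, north
    theta_new = theta + D / R                 # (2.14) updated heading
    x_new = c_x + D * math.cos(theta_new)     # (2.15)
    y_new = c_y + D * math.sin(theta_new)     # (2.16)
    return x_new, y_new, theta_new
```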
S2120, according to the current frame fusion positioning updating result and the previous frame fusion positioning result, predicting and updating the state quantity of the set filter and the variance of the prediction error.
For example, the state quantity of the prediction updated set filter and the variance of the prediction error may be determined using the following equations:
u_i = X'_i − X_{i-1}; (2.22)

X̂⁻_i = X_{i-1} + u_i; (2.23)

P⁻_i = P_{i-1} + Q; (2.24)

wherein X̂⁻_i is the prediction-updated state quantity of the filter; u_i is the difference between the current frame fusion positioning result after the fusion positioning update and the previous frame fusion positioning result; P⁻_i is the variance of the prediction-updated prediction error; P_{i-1} is the variance of the filter observation error corresponding to the previous frame fusion positioning; Q is the parameter matrix corresponding to the mean of the prediction error w.
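A minimal sketch of the prediction update of formulas (2.22)-(2.24), assuming per-component (diagonal) variances:

```python
import numpy as np

def predict_update(X_prev, X_fused_update, P_prev, Q):
    """Prediction update: the control increment u_i is the difference between
    the odometry-updated fused pose and the previous fused pose, so the
    predicted state equals the updated pose; the variance grows by Q."""
    u_i = X_fused_update - X_prev          # (2.22)
    X_pred = X_prev + u_i                  # (2.23)
    P_pred = P_prev + Q                    # (2.24)
    return X_pred, P_pred
```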
And S2130, acquiring the weight of each target measurement source according to the confidence coefficient and the preset weight in the current frame measurement source data participating in fusion positioning.
In the embodiment of the present invention, for the current frame measurement source data participating in fusion positioning, the weight of each target measurement source is obtained as the ratio of the product of the confidence included in its current frame measurement source data and its preset weight (the preset weights may be consistent across all sources, or different values may be set by region) to the sum of the confidence-weight products of all measurement sources.
The embodiment of the invention can utilize the following formula to calculate the weight of each target measurement source participating in fusion:
W_m = c_m · w_m / (Σ_k c_k · w_k); (2.25)

wherein c_m is the confidence of the current target measurement source, w_m is the preset weight corresponding to the current target measurement source, c_k is the confidence of the k-th target measurement source, and w_k is the preset weight corresponding to the k-th target measurement source.
S2140, obtaining the observed quantity of the set filter according to the weight of each target measurement source and the corresponding current frame measurement source data.
Illustratively, the product of the current frame measurement source data corresponding to each target measurement source and its weight is calculated, and the products of all target measurement sources are accumulated to obtain the observed quantity Z of the filter.
The embodiment of the invention can adopt the following formula to calculate the observed quantity Z of the filter:
Z = Σ_m W_m · X'_m; (2.26)

wherein W_m represents the weight of each target measurement source participating in the fusion, and X'_m represents the current frame measurement source data of each target measurement source participating in the fusion.
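As a non-limiting illustration, formulas (2.25)-(2.26) may be sketched together: the confidence-weighted normalization followed by the weighted sum that forms the observed quantity Z:

```python
import numpy as np

def weighted_observation(confidences, preset_weights, sources):
    """Compute the normalized weights W_m per (2.25) and the observed
    quantity Z as the weighted sum of the source data per (2.26)."""
    c = np.asarray(confidences, dtype=float)
    w = np.asarray(preset_weights, dtype=float)
    W = c * w / np.sum(c * w)                                          # (2.25)
    Z = np.sum(W[:, None] * np.asarray(sources, dtype=float), axis=0)  # (2.26)
    return W, Z
```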
S2150, observing and updating the variance of the observation error of the set filter according to the variance of the prediction error after prediction updating and the mean value of the observation error of the set filter.
Illustratively, the observation update coefficient K_i is calculated with the variance P⁻_i of the prediction-updated prediction error as the numerator and the sum of P⁻_i and the mean R of the observation errors of the set filter as the denominator; K_i is a matrix.

In the embodiment of the present invention, K_i can be calculated by the following formula:

K_i = P⁻_i · (P⁻_i + R)⁻¹; (2.27)

wherein P⁻_i is the variance of the prediction-updated prediction error of the filter, and R is the mean of the observation errors of the filter, represented by the parameter matrix.
The difference between the identity matrix I and the observation update coefficient K_i is calculated and multiplied by the variance P⁻_i of the prediction-updated prediction error to obtain the variance P_i of the observation error after the filter observation update.

In the embodiment of the present invention, P_i can be calculated by the following formula:

P_i = (I − K_i) · P⁻_i; (2.28)
S2160, generating a current frame fusion positioning result according to the obtained observed quantity of the set filter and the predicted updated state quantity.
Illustratively, when the filter has been reset and the prediction update and observation update are completed, the difference between the observed quantity Z of the filter and the prediction-updated state quantity X̂⁻_i of the filter is calculated, multiplied by the observation update coefficient K_i, and then added to the prediction-updated state quantity X̂⁻_i to obtain the current frame fusion positioning result.
In the embodiment of the present invention, the current frame fusion positioning result X_i can be calculated by the following formula:

X_i = X̂⁻_i + K_i · (Z − X̂⁻_i); (2.29)

wherein X_i is the current frame fusion positioning result.
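A minimal per-component sketch of the observation update described above; the diagonal form replaces the matrix inverse with an element-wise division, which is an assumption of this sketch:

```python
import numpy as np

def observation_update(X_pred, P_pred, Z, R):
    """Per-component observation update: gain K_i = P_i^- / (P_i^- + R),
    posterior variance P_i = (1 - K_i) * P_i^-, and fused state
    X_i = X_pred + K_i * (Z - X_pred)."""
    K = P_pred / (P_pred + R)              # observation update coefficient
    P = (1.0 - K) * P_pred                 # variance after observation update
    X = X_pred + K * (Z - X_pred)          # current frame fusion positioning result
    return X, P, K
```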
And S2170, acquiring fusion positioning confidence of the current frame fusion positioning result according to the weight and the confidence of each target measurement source participating in fusion.
Illustratively, the product of the weight and the confidence of each target measurement source participating in the fusion is calculated, and the products of the target measurement sources participating in the fusion are added to obtain the fusion positioning confidence of the current frame fusion positioning result.
In the embodiment of the present invention, the fusion positioning confidence can be obtained by the following formula:

C_i = Σ_k c_k · W_k; (2.33)

wherein W_k is the weight of each target measurement source participating in the fusion, c_k is the confidence of each target measurement source participating in the fusion, and C_i is the fusion positioning confidence of the current frame.
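As a non-limiting illustration, formula (2.33) may be sketched as a one-line weighted sum:

```python
def fusion_confidence(weights, confidences):
    """Confidence of the current frame fusion positioning result, per (2.33):
    the weight-weighted sum of the participating source confidences."""
    return sum(w * c for w, c in zip(weights, confidences))
```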
And S2180, determining the fusion positioning state of the current frame fusion positioning result according to the fusion positioning confidence.
In the embodiment of the invention, if the fusion positioning confidence is higher than the set second confidence threshold, the fusion positioning state is updated to high-precision positioning; otherwise, the fusion positioning state is updated to low-precision positioning. The second confidence threshold is the threshold for judging whether the fusion positioning is high-precision, and its value may be 0.7 to 0.9.
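Steps S2170 and S2180 can be sketched in Python as follows; the threshold value 0.8 is one point chosen inside the 0.7-0.9 range stated above, and all names are illustrative:

```python
SECOND_CONFIDENCE_THRESHOLD = 0.8  # chosen inside the 0.7-0.9 range in the text

def fusion_confidence(sources):
    """sources: iterable of (W_k, c_k) pairs, one per target measurement
    source participating in the fusion. Implements C_i = sum_k c_k * W_k (2.33)."""
    return sum(w * c for w, c in sources)

def fusion_state(c_i):
    """S2180: classify the fusion positioning state from the confidence C_i."""
    return "high-precision" if c_i > SECOND_CONFIDENCE_THRESHOLD else "low-precision"
```

With two sources weighted 0.6 and 0.4 at confidences 0.9 and 0.8, C_i = 0.86 and the state is classified as high-precision.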
Further, the fusion positioning information such as the positioning result and the positioning state is output to the planning layer of the unmanned system, so that the planning layer analyzes the positioning information and generates a vehicle control strategy.
By resetting the filter with the measurement source cluster center data rather than with data from a single measurement source, the embodiment of the invention avoids obtaining biased fusion positioning data; by setting the weight of each target measurement source in advance, the influence of target measurement sources with low positioning accuracy on the fusion positioning is also reduced.
Fig. 3 is a flowchart of another fusion positioning method according to an embodiment of the present invention. In this embodiment, the method further includes the steps of triggering a filter reset based on the pure motion estimation state, and performing prediction update on the state quantity of the set filter and the variance of the prediction error. As shown in fig. 3, the method includes:
S310, acquiring odometer data and measurement source data of at least one path of measurement source when the fusion positioning event is triggered.
S320, screening out target measurement sources with effective time confidence according to the time stamps in the measurement source data, updating the target measurement source data corresponding to the target measurement sources through the odometer data, and generating current frame measurement source data of each path of target measurement sources.
S330, judging whether the target measurement source corresponding to the current frame measurement source data participates in the fusion positioning or not according to the current frame measurement source data and the previous frame fusion positioning result.
And S340, if no target measurement source participates in the fusion positioning, updating the previous frame of fusion positioning result according to the odometer data to obtain a current frame of fusion positioning updating result, taking the current frame of fusion positioning updating result as the current frame of fusion positioning result, and updating the fusion positioning state into a pure motion estimation state.
Illustratively, if no target measurement source participates in the fusion positioning, the current frame fusion positioning updating result is used as the current frame fusion positioning result, and the fusion positioning state is updated to the pure motion estimation state.
In the embodiment of the invention, the current frame fusion positioning updating result X'_i is calculated through formula (2.21), and the specific calculation process is not described again here.
In the embodiment of the invention, if no target measurement source participates in the fusion, X'_i is output as the current frame fusion positioning result, and the fusion positioning state is updated to the pure motion estimation state.
And S350, acquiring the duration or the continuous distance of the pure motion estimation state, and judging whether the current frame fusion positioning result triggers a filter reset based on the duration or the continuous distance.
In the embodiment of the invention, if the fusion positioning states of the continuous multiple frames are all pure motion estimation states, the duration of the pure motion estimation states of the continuous frames can be counted, and if the duration exceeds a set fifth time threshold, the resetting of the filter is triggered, namely, the resetting process of the filter is started. The fifth time threshold is a pure motion estimation time threshold, and the value of the fifth time threshold can be 1 s-5 s.
Optionally, if the fused positioning states of the consecutive frames are all pure motion estimation states, the continuous distance of the pure motion estimation states of the consecutive frames may be counted, and if the continuous distance exceeds a set fifth distance threshold, the reset of the filter may be triggered, that is, the reset process of the filter is entered. The fifth distance threshold is a pure motion estimation distance threshold, and the value of the fifth distance threshold may be 5m to 10 m.
It should be noted that the filter resetting process has been described in the above embodiments, and is not described herein again.
Specifically, the duration of the pure motion estimation state may be determined according to a difference between the current frame fused positioning timestamp and a timestamp of the start of the pure motion estimation state of the current fused positioning.
The calculation formula of the duration time of the pure motion estimation state in the embodiment of the invention is as follows:
ΔT = t_i - t_s; (3.1)
wherein t_s is the timestamp at which the pure motion estimation state of the fusion positioning starts, and t_i is the current frame fusion positioning timestamp.
Optionally, the difference between the fusion positioning timestamp of the kth frame and that of the (k-1)th frame may be calculated and multiplied by the odometer vehicle speed of the kth frame, and these products may be summed over the period from the timestamp at which the current pure motion estimation state starts to the fusion positioning timestamp of the previous frame, to obtain the continuous distance of the pure motion estimation state.
The calculation formula of the continuous distance of the pure motion estimation state in the embodiment of the invention is as follows:
ΔS = ∑_{k=s+1}^{i-1} (t_k - t_{k-1}) * v_k; (3.2)
wherein t_s is the timestamp at which the pure motion estimation state of the fusion positioning starts; t_i is the current frame fusion positioning timestamp; and v_k is the odometer vehicle speed of the kth frame.
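Step S350 can be sketched as follows; the per-frame bookkeeping, the concrete threshold values and all names are illustrative assumptions (the thresholds are chosen inside the ranges stated above):

```python
FIFTH_TIME_THRESHOLD = 3.0      # s, inside the 1 s-5 s range in the text
FIFTH_DISTANCE_THRESHOLD = 8.0  # m, inside the 5 m-10 m range in the text

def should_reset_filter(timestamps, speeds, t_s, t_i):
    """Decide whether the pure motion estimation state triggers a filter reset.

    timestamps -- fusion positioning timestamps of the frames in [t_s, t_i]
    speeds     -- odometer vehicle speed v_k of each of those frames
    """
    duration = t_i - t_s                                     # ΔT, formula (3.1)
    distance = sum((timestamps[k] - timestamps[k - 1]) * speeds[k]
                   for k in range(1, len(timestamps)))       # ΔS, formula (3.2)
    return duration > FIFTH_TIME_THRESHOLD or distance > FIFTH_DISTANCE_THRESHOLD
```

A reset is triggered as soon as either the elapsed time or the distance travelled on pure odometry exceeds its threshold, whichever happens first.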
By judging the pure motion estimation state, the embodiment of the invention can actively trigger a reset of the fusion positioning, thereby restoring the fusion positioning to normal.
Fig. 4 is a flowchart of another fusion positioning method according to an embodiment of the present invention. In this embodiment, after the step of generating the current frame fusion positioning result based on the corresponding current frame measurement source data by using the set filter, the method further includes judging whether each path of target measurement source is in an abnormal state, and instructing abnormal target measurement sources to reset. As shown in fig. 4, the method includes:
S410, acquiring odometer data and measurement source data of at least one path of measurement source when the fusion positioning event is triggered.
S420, screening out target measurement sources with effective time confidence according to the time stamps in the measurement source data, updating the target measurement source data corresponding to the target measurement sources through the odometer data, and generating current frame measurement source data of each path of target measurement sources.
And S430, judging whether the target measurement source corresponding to the current frame measurement source data participates in the fusion positioning or not according to the current frame measurement source data and the previous frame fusion positioning result.
And S440, generating a current frame fusion positioning result based on the corresponding current frame measurement source data by adopting a set filter for the target measurement source participating in fusion positioning.
S450, determining a third distance difference value and a third course angle difference value between the current frame measurement source data of each path of target measurement source and a current frame second fusion positioning result, wherein the current frame second fusion positioning result comprises position data and a course angle in the current frame fusion positioning result.
In the embodiment of the invention, the third distance difference d_mi and the third heading angle difference Δθ_mi between the current frame measurement source data and the current frame second fusion positioning result can be calculated by the following formulas:
d_mi = √((x_m - x_i)² + (y_m - y_i)²); (4.1)
Δθ_mi = |θ_m - θ_i|; (4.2)
wherein x_m, y_m and θ_m are respectively the east position, the north position and the heading angle of the current frame measurement source data, and x_i, y_i and θ_i are respectively the east position, the north position and the heading angle included in the current frame second fusion positioning result.
And S460, judging whether each road of target measurement source is in an abnormal state or not according to the third distance difference and the third course angle difference, if so, executing S470, and otherwise, executing S480.
In the embodiment of the invention, when the distance d_mi between the current frame measurement source data and the fusion positioning result is greater than the set sixth distance threshold and the heading angle difference Δθ_mi is greater than the set fifth heading threshold, the abnormality start timestamp t_ms of the target measurement source is recorded. When the continuous abnormal time ΔT of the target measurement source is greater than the set sixth time threshold, the state of the target measurement source is determined to be abnormal; otherwise, it is determined to be normal. The sixth time threshold is the measurement source abnormal time threshold, and its value may be 1 s to 2 s. The sixth distance threshold is the measurement source abnormal distance threshold, and its value may be 0.8 m to 1.0 m. The fifth heading threshold is the measurement source abnormal heading threshold, and its value may be 0.5° to 1.5°.
In the embodiment of the invention, the following formula can be used for calculating the continuous abnormal time delta T of the target measurement source:
ΔT = t_i - t_ms; (4.4)
wherein t_i is the current frame fusion positioning timestamp, and t_ms is the abnormality start timestamp of the target measurement source.
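Steps S450 to S460 can be sketched as below. Here t_ms is assumed to have been recorded when the deviation was first observed; the threshold values are picked inside the ranges given above, and all names are illustrative:

```python
import math

SIXTH_DISTANCE_THRESHOLD = 0.9  # m, inside the 0.8-1.0 m range
FIFTH_HEADING_THRESHOLD = 1.0   # deg, inside the 0.5-1.5 deg range
SIXTH_TIME_THRESHOLD = 1.5      # s, inside the 1-2 s range

def measurement_source_state(x_m, y_m, theta_m, x_i, y_i, theta_i, t_i, t_ms):
    """Classify one target measurement source as 'abnormal' or 'normal'."""
    d_mi = math.hypot(x_m - x_i, y_m - y_i)  # third distance difference, (4.1)
    dtheta_mi = abs(theta_m - theta_i)       # third heading angle difference, (4.2)
    deviated = (d_mi > SIXTH_DISTANCE_THRESHOLD
                and dtheta_mi > FIFTH_HEADING_THRESHOLD)
    if deviated and (t_i - t_ms) > SIXTH_TIME_THRESHOLD:  # ΔT, formula (4.4)
        return "abnormal"
    return "normal"
```

A source flagged "abnormal" would then be sent the abnormal state information of step S470 so that it resets itself.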
And S470, sending the abnormal state information to the corresponding target measurement source to indicate the corresponding target measurement source to reset.
In the embodiment of the invention, if a target measurement source is in an abnormal state, abnormal state information is sent to that target measurement source to instruct it to reset. A measurement source that has completed its reset can continue to participate in the fusion; a measurement source whose reset has not completed has an invalid time confidence and does not participate in the fusion.
And S480, determining that the target measurement source is in a normal state.
By judging the state of each target measurement source and resetting abnormal measurement sources, the embodiment of the invention restores the positioning of abnormal measurement sources to normal and improves the robustness of the fusion positioning against abnormal measurement sources.
Fig. 5 is a schematic structural diagram of a fusion positioning device according to an embodiment of the present invention. As shown in fig. 5, the apparatus includes:
a data obtaining module 510, configured to obtain odometer data and measurement source data of at least one measurement source when a fusion positioning event is triggered;
the data updating module 520 is configured to screen out a target measurement source with an effective time confidence according to the timestamp in the measurement source data, update target measurement source data corresponding to the target measurement source through odometer data, and generate current frame measurement source data of each path of target measurement source;
a fusion participation judgment module 530, configured to judge whether the target measurement source corresponding to the current frame measurement source data participates in the fusion positioning according to the current frame measurement source data and the previous frame fusion positioning result;
and the fusion positioning module 540 is configured to generate a current frame fusion positioning result based on the corresponding current frame measurement source data by using a setting filter for the target measurement source participating in fusion positioning.
Optionally, the apparatus further includes, after the data obtaining module 510:
and the marking module is used for clustering the target measurement sources based on the current frame measurement source data of each path of target measurement source and adding clustering marks to each path of target measurement source according to the clustering result.
Optionally, the marking module includes:
the first measurement source determining unit is used for determining a first measurement source which is successfully clustered in the target measurement sources according to the clustering marks;
the difference determining unit is used for acquiring cluster center data of the first measurement source, and determining a first distance difference and a first course difference between the cluster center data and a first fusion positioning result of the current frame, wherein the first fusion positioning result of the current frame comprises position data and a course angle in the fusion positioning result of the current frame;
the time stamp determining unit is used for determining a starting time stamp of the fusion positioning deviation according to the first distance difference value and the first course angle difference value;
the deviation state determining unit is used for recording the duration of the fusion positioning deviation based on the starting timestamp, and determining that the fusion positioning state is a deviation state when the duration meets a set condition;
and the filter resetting unit is used for determining that the first fusion positioning result of the current frame triggers the filter to be reset when the fusion positioning state is a deviated state.
Optionally, the apparatus further includes, after the filter resetting unit:
and the average value updating subunit is used for acquiring the average value of the current frame measurement source data corresponding to each first measurement source, and updating the state quantity of the set filter according to the average value so as to realize filter resetting.
Optionally, the marking module further includes:
the timing unit is used for determining a second measurement source according to the timestamp corresponding to the target measurement source and timing by taking the timestamp corresponding to the second measurement source as a timing starting point if the first measurement source which is successfully clustered does not exist in the target measurement sources according to the clustering marks;
and the state quantity updating unit is used for updating the state quantity of the set filter according to the current frame measurement source data corresponding to the second measurement source to realize filter resetting if the first measurement source which is successfully clustered does not exist within the set timeout time.
Optionally, the apparatus further includes, after the filter resetting unit:
the positioning result updating unit is used for updating the previous frame of fusion positioning result according to the odometer data to obtain a current frame of fusion positioning updating result;
and the prediction updating unit is used for predicting and updating the state quantity of the set filter and the variance of the prediction error according to the current frame fusion positioning updating result and the previous frame fusion positioning result.
Optionally, the prediction updating unit includes:
the weight determining subunit is used for obtaining the weight of each path of target measurement source according to the confidence coefficient and the preset weight in the current frame measurement source data participating in the fusion positioning;
the observation quantity determining subunit is used for acquiring the observation quantity of the set filter according to the weight of each path of target measurement source and the corresponding current frame measurement source data;
and the fusion positioning result acquisition subunit is used for generating a current frame fusion positioning result according to the acquired observed quantity of the set filter and the predicted and updated state quantity.
Optionally, the apparatus further includes, after the observation quantity determining subunit:
and the variance updating subunit is used for observing and updating the variance of the observation error of the set filter according to the variance of the prediction error after the prediction updating and the mean value of the observation error of the set filter.
Optionally, the apparatus further includes, after the observation quantity determining subunit:
the confidence determining subunit is used for acquiring the fusion positioning confidence of the current frame fusion positioning result according to the weight and the confidence of each target measurement source participating in the fusion;
and the fusion positioning state determining subunit is used for determining the fusion positioning state of the current frame fusion positioning result according to the fusion positioning confidence.
Optionally, the filter resetting unit further includes:
and the pure motion estimation state subunit is used for taking the current frame fusion positioning updating result as the current frame fusion positioning result and updating the fusion positioning state into the pure motion estimation state if no target measurement source participates in the fusion positioning.
Optionally, the apparatus further includes, after the pure motion estimation state subunit:
and the triggering condition judging subunit is used for acquiring the duration or the continuous distance of the pure motion estimation state and judging whether the current frame fusion positioning result triggers a filter reset based on the duration or the continuous distance.
Optionally, the data updating module 520 includes:
the time stamp difference determining unit is used for determining a time stamp difference according to the current time stamp and the time stamp in the target measurement source data;
and the current frame measurement source generating unit is used for generating a compensation coefficient according to the timestamp difference, the odometer data and the vehicle wheel base, updating target measurement source data corresponding to each target measurement source according to the compensation coefficient and generating current frame measurement source data of each target measurement source.
Optionally, the fusion participation determining module 530 includes:
the second difference determining unit is used for determining a second distance difference and a second course angle difference of the current frame measurement source data of each path of target measurement source and the previous frame fusion positioning result;
the threshold value determining unit is used for acquiring a first group of preset threshold values and a second group of preset threshold values, wherein the second group of threshold values are larger than the first group of threshold values, and each group of threshold values comprises a distance threshold value and a heading threshold value;
the participating fusion positioning determining unit is used for determining that the corresponding target measurement source participates in fusion positioning when the second distance difference value and the second course angle difference value of each path of target measurement source are smaller than the first group of threshold values;
the comparison unit is used for comparing the second distance difference and the second course angle difference with a second group of threshold values when the second distance difference or the second course angle difference is equal to or larger than the first group of threshold values; and if the second distance difference and the second heading angle difference are both smaller than a second group of threshold values, determining that the corresponding target measurement source participates in the fusion positioning.
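The two-tier threshold check performed by these units can be sketched as follows; the concrete threshold values are illustrative assumptions, not taken from the patent:

```python
FIRST_GROUP = (0.5, 1.0)   # (distance m, heading deg) -- tight thresholds
SECOND_GROUP = (1.0, 2.0)  # looser thresholds, each larger than the first group

def participates_in_fusion(d2, dtheta2):
    """d2 / dtheta2: second distance and heading angle differences between the
    current frame measurement source data and the previous frame result."""
    if d2 < FIRST_GROUP[0] and dtheta2 < FIRST_GROUP[1]:
        return True  # both differences below the first group: participate
    # otherwise compare both differences against the looser second group
    return d2 < SECOND_GROUP[0] and dtheta2 < SECOND_GROUP[1]
```

A source is accepted outright when it agrees tightly with the previous frame result, and is still accepted when both differences remain under the looser second group of thresholds.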
Optionally, the apparatus further includes, after the fusion positioning module 540:
the third difference determining unit is used for determining a third distance difference and a third course angle difference between the current frame measurement source data of each path of target measurement source and a current frame second fusion positioning result, wherein the current frame second fusion positioning result comprises position data and a course angle in the current frame fusion positioning result;
the abnormal state determining unit is used for judging whether each path of target measurement source is in an abnormal state or not according to the third distance difference value and the third course angle difference value; if so, sending abnormal state information to the corresponding target measurement source to indicate the corresponding target measurement source to reset.
The fusion positioning device provided by the embodiment of the invention can execute the fusion positioning method provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of the execution method.
FIG. 6 illustrates a schematic block diagram of a vehicle 10 that may be used to implement an embodiment of the present invention. As shown in fig. 6, the vehicle 10 includes at least one processor 11, and a memory communicatively connected to the at least one processor 11, such as a Read Only Memory (ROM)12, a Random Access Memory (RAM)13, and the like, wherein the memory stores a computer program executable by the at least one processor, and the processor 11 may perform various appropriate actions and processes according to the computer program stored in the Read Only Memory (ROM)12 or the computer program loaded from a storage unit 18 into the Random Access Memory (RAM) 13. In the RAM 13, various programs and data required for the operation of the vehicle 10 can also be stored. The processor 11, the ROM 12, and the RAM 13 are connected to each other via a bus 14. An input/output (I/O) interface 15 is also connected to bus 14.
Various components in the vehicle 10 are connected to the I/O interface 15, including: an input unit 16 such as a keyboard, a mouse, or the like; an output unit 17 such as various types of displays, speakers, and the like; a storage unit 18 such as a magnetic disk, an optical disk, or the like; and a communication unit 19 such as a network card, modem, wireless communication transceiver, etc. The communication unit 19 allows the vehicle 10 to exchange information/data with other devices via a computer network such as the internet and/or various telecommunication networks.
The processor 11 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of processor 11 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, or the like. The processor 11 performs the various methods and processes described above, such as a fusion positioning method.
In some embodiments, a fusion positioning method may be implemented as a computer program tangibly embodied in a computer-readable storage medium, such as the storage unit 18. In some embodiments, part or all of the computer program may be loaded and/or installed on the vehicle 10 via the ROM 12 and/or the communication unit 19. When the computer program is loaded into the RAM 13 and executed by the processor 11, one or more steps of the fusion positioning method described above may be performed. Alternatively, in other embodiments, the processor 11 may be configured to perform the fusion positioning method by any other suitable means (e.g., by means of firmware).
Various implementations of the systems and techniques described here above may be implemented in digital electronic circuitry, integrated circuitry, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), system on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implemented in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for implementing the methods of the present invention may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be performed. A computer program can execute entirely on a machine, partly on a machine, as a stand-alone software package partly on a machine and partly on a remote machine or entirely on a remote machine or server.
In the context of the present invention, a computer-readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. A computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described herein may be implemented on a vehicle having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user may provide input to the vehicle. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present invention may be executed in parallel, sequentially, or in different orders, and are not limited herein as long as the desired result of the technical solution of the present invention can be achieved.
The above-described embodiments should not be construed as limiting the scope of the invention. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A fusion positioning method, comprising:
acquiring odometer data and measurement source data of at least one path of measurement source when a fusion positioning event is triggered;
screening out target measurement sources with effective time confidence according to timestamps in the measurement source data, updating target measurement source data corresponding to the target measurement sources through the odometer data, and generating current frame measurement source data of each path of target measurement sources;
judging whether a target measurement source corresponding to the current frame measurement source data participates in fusion positioning or not according to the current frame measurement source data and the previous frame fusion positioning result;
and for the target measurement source participating in fusion positioning, generating a current frame fusion positioning result by adopting a set filter based on the corresponding current frame measurement source data.
2. The method of claim 1, after generating the current frame measurement source data of each target measurement source, further comprising:
clustering the target measurement sources based on the current frame measurement source data of each path of target measurement source, and adding a clustering mark to each path of target measurement source according to a clustering result.
3. The method of claim 2, further comprising:
determining a first measurement source which is successfully clustered in the target measurement sources according to the clustering marks;
acquiring cluster center data of the first measurement source, and determining a first distance difference value and a first course difference value of the cluster center data and a current frame first fusion positioning result, wherein the current frame first fusion positioning result comprises position data and a course angle in the current frame fusion positioning result;
determining a starting timestamp fusing the positioning deviation according to the first distance difference and the first course angle difference;
recording the duration of the fusion positioning deviation based on the starting timestamp, and determining that the fusion positioning state is a deviation state when the duration meets a set condition;
and when the fusion positioning state is a deviation state, determining that the first fusion positioning result of the current frame triggers the filter to reset.
4. The method of claim 2, further comprising:
if it is determined that a first measurement source which is successfully clustered does not exist in the target measurement sources according to the clustering marks, determining a second measurement source according to a timestamp corresponding to the target measurement source, and timing by taking the timestamp corresponding to the second measurement source as a timing starting point;
and if the first measurement source which is successfully clustered does not exist within the set timeout time, updating the state quantity of the set filter according to the current frame measurement source data corresponding to the second measurement source so as to realize filter resetting.
5. The method of claim 4, further comprising, after the filter is reset:
updating the previous frame fusion positioning result according to the odometer data to obtain a current frame fusion positioning update result;
and performing prediction update on the state quantity of the set filter and the variance of the prediction error according to the current frame fusion positioning update result and the previous frame fusion positioning result.
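The odometer-based prediction step of claim 5 amounts to composing the previous fused pose with a body-frame odometry increment. A sketch under the assumption of a planar `(x, y, heading)` pose and an increment `(dx, dy, dtheta)` expressed in the vehicle frame (the patent does not fix these conventions):

```python
import math

def predict_with_odometry(prev_pose, odom_delta):
    """Propagate the previous frame fusion positioning result with odometer
    data to obtain the current frame fusion positioning update result."""
    x, y, th = prev_pose
    dx, dy, dth = odom_delta
    # rotate the body-frame increment into the world frame, then compose
    return (x + dx * math.cos(th) - dy * math.sin(th),
            y + dx * math.sin(th) + dy * math.cos(th),
            th + dth)
```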
6. The method of claim 5, wherein generating the current frame fusion positioning result based on the corresponding current frame measurement source data by using the set filter comprises:
acquiring the weight of each target measurement source according to the confidence and the preset weight in the current frame measurement source data participating in the fusion positioning;
acquiring the observed quantity of the set filter according to the weight of each target measurement source and the corresponding current frame measurement source data;
and generating the current frame fusion positioning result according to the acquired observed quantity of the set filter and the predicted and updated state quantity.
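Claim 6 forms the filter observation as a weighted combination of the participating sources' measurements. A Python sketch follows; combining confidence and preset weight by multiplication, and normalising over all participating sources, are assumptions — the claim only says the weight is obtained from both.

```python
def weighted_observation(sources):
    """sources: list of dicts with 'confidence', 'preset_weight' and a
    3-tuple 'measurement' (x, y, heading). Heading averaging here ignores
    angle wrap-around for brevity."""
    total = sum(s["confidence"] * s["preset_weight"] for s in sources)
    obs = [0.0, 0.0, 0.0]
    for s in sources:
        w = s["confidence"] * s["preset_weight"] / total  # normalised weight
        for i in range(3):
            obs[i] += w * s["measurement"][i]
    return tuple(obs)
```

The resulting tuple would then serve as the observed quantity in the filter's measurement update.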
7. The method of claim 5, further comprising:
and if no target measurement source participates in the fusion positioning, taking the current frame fusion positioning update result as the current frame fusion positioning result, and updating the fusion positioning state to a pure motion estimation state.
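Claim 7's fallback is a simple selection: with no participating sources, the odometer-only motion update becomes the output and the system drops to dead reckoning. A sketch (function name and state labels are assumptions):

```python
def select_current_frame_result(participating, fused_result, motion_update):
    """Return (current frame fusion positioning result, fusion positioning state)."""
    if not participating:
        # no target measurement source participates in fusion positioning:
        # fall back to pure motion estimation (dead reckoning)
        return motion_update, "PURE_MOTION_ESTIMATION"
    return fused_result, "FUSED"
```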
8. A fusion positioning device, comprising:
the data acquisition module is used for acquiring odometer data and measurement source data of at least one measurement source when a fusion positioning event is triggered;
the data updating module is used for screening out target measurement sources whose time confidence is valid according to the timestamps in the measurement source data, updating the target measurement source data corresponding to the target measurement sources with the odometer data, and generating current frame measurement source data for each target measurement source;
the fusion participation judging module is used for judging whether the target measurement source corresponding to the current frame measurement source data participates in fusion positioning according to the current frame measurement source data and the previous frame fusion positioning result;
and the fusion positioning module is used for generating, by using a set filter, a current frame fusion positioning result for the target measurement sources participating in fusion positioning based on the corresponding current frame measurement source data.
9. A vehicle, characterized in that the vehicle comprises:
at least one measurement source for providing measurement source data while the vehicle is in motion;
the odometer is used for providing odometer data in the running process of the vehicle;
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores a computer program executable by the at least one processor, to enable the at least one processor to perform the fusion positioning method of any one of claims 1-7.
10. A computer-readable storage medium having stored thereon computer instructions for causing a processor to execute the fusion positioning method of any one of claims 1-7.
CN202210540746.6A 2022-05-17 2022-05-17 Fusion positioning method, device, vehicle and storage medium Active CN114964270B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210540746.6A CN114964270B (en) 2022-05-17 2022-05-17 Fusion positioning method, device, vehicle and storage medium


Publications (2)

Publication Number Publication Date
CN114964270A true CN114964270A (en) 2022-08-30
CN114964270B CN114964270B (en) 2024-04-26

Family

ID=82982565

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210540746.6A Active CN114964270B (en) 2022-05-17 2022-05-17 Fusion positioning method, device, vehicle and storage medium

Country Status (1)

Country Link
CN (1) CN114964270B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115856976A (en) * 2023-02-27 2023-03-28 智道网联科技(北京)有限公司 Fusion positioning method and device for automatic driving vehicle and electronic equipment

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140372026A1 (en) * 2011-09-14 2014-12-18 Trusted Positioning Inc. Method and apparatus for navigation with nonlinear models
JP2017125820A (en) * 2016-01-15 2017-07-20 三菱電機株式会社 Information processing apparatus, information processing method, and information processing program
CN109099920A (en) * 2018-07-20 2018-12-28 重庆长安汽车股份有限公司 Sensor target accurate positioning method based on Multisensor association
CN109579844A (en) * 2018-12-04 2019-04-05 电子科技大学 Localization method and system
WO2019114807A1 (en) * 2017-12-15 2019-06-20 蔚来汽车有限公司 Multi-sensor target information fusion
CN110631574A (en) * 2018-06-22 2019-12-31 北京自动化控制设备研究所 inertia/odometer/RTK multi-information fusion method
CN111121755A (en) * 2020-01-02 2020-05-08 广东博智林机器人有限公司 Multi-sensor fusion positioning method, device, equipment and storage medium
CN111141273A (en) * 2019-12-18 2020-05-12 无锡北微传感科技有限公司 Combined navigation method and system based on multi-sensor fusion
CN113327344A (en) * 2021-05-27 2021-08-31 北京百度网讯科技有限公司 Fusion positioning method, device, equipment, storage medium and program product
US20220018962A1 (en) * 2020-07-16 2022-01-20 Beijing Tusen Weilai Technology Co., Ltd. Positioning method and device based on multi-sensor fusion


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ZHAODONG LI: "Design of Intelligent Mobile Robot Positioning Algorithm Based on IMU/Odometer/Lidar", 2019 INTERNATIONAL CONFERENCE ON SENSING, DIAGNOSTICS, PROGNOSTICS, AND CONTROL, 18 September 2020 (2020-09-18) *
CHEN Yunfang; YE Zetian: "Research on a Vehicle-Mounted Mobile Mapping System Based on Multi-Sensor Fusion", Bulletin of Surveying and Mapping (测绘通报), no. 01, 25 January 2007 (2007-01-25) *


Also Published As

Publication number Publication date
CN114964270B (en) 2024-04-26

Similar Documents

Publication Publication Date Title
US10422658B2 (en) Method, fusion filter, and system for fusing sensor signals with different temporal signal output delays into a fusion data set
US9360323B2 (en) Systems and methods for estimating movements of a vehicle using a mobile device
CN109444932B (en) Vehicle positioning method and device, electronic equipment and storage medium
US11815355B2 (en) Method and system for combining sensor data
EP3321631B1 (en) A inertial and terrain based navigation system
US11120562B2 (en) Posture estimation method, posture estimation apparatus and computer readable storage medium
CN111077549B (en) Position data correction method, apparatus and computer readable storage medium
CN107884800B (en) Combined navigation data resolving method and device for observation time-lag system and navigation equipment
CN110715659A (en) Zero-speed detection method, pedestrian inertial navigation method, device and storage medium
CN109059907A (en) Track data processing method, device, computer equipment and storage medium
JP2017194460A (en) Navigation system and method for error correction
CN113959457B (en) Positioning method and device for automatic driving vehicle, vehicle and medium
CN114964270B (en) Fusion positioning method, device, vehicle and storage medium
CN111721299B (en) Real-time positioning time synchronization method and device
CN112102418B (en) Calibration method, calibration device, electronic equipment and storage medium
EP3060943B1 (en) Improved system for post processing gnss/ins measurement data and camera image data
CN111183464B (en) System and method for estimating saturation flow of signal intersection based on vehicle trajectory data
JP2019082328A (en) Position estimation device
CN115752471A (en) Sensor data processing method and device and computer readable storage medium
CN114199236A (en) Positioning data processing method and device, electronic equipment and automatic driving vehicle
CN115037703A (en) Data processing method, data processing apparatus, computer storage medium, and computer program product
CN111191734A (en) Sensor data fusion method, device, equipment and storage medium
TWI636236B (en) Method for determining states of a system by means of an estimation filter, device for determining a position of an object and unmanned aerial vehicle
CN112394190B (en) Method and device for determining angular velocity, storage medium, and electronic device
CN113055598B (en) Orientation data compensation method and device, electronic equipment and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant