CN115824223A - Indoor and outdoor seamless positioning method based on multi-source fusion - Google Patents

Indoor and outdoor seamless positioning method based on multi-source fusion

Info

Publication number
CN115824223A
CN115824223A
Authority
CN
China
Prior art keywords
positioning
time
outdoor
indoor
coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211652953.7A
Other languages
Chinese (zh)
Inventor
宋晗
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Guangyu Technology Co ltd
Original Assignee
Zhejiang Guangyu Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Guangyu Technology Co ltd filed Critical Zhejiang Guangyu Technology Co ltd
Priority to CN202211652953.7A priority Critical patent/CN115824223A/en
Publication of CN115824223A publication Critical patent/CN115824223A/en
Pending legal-status Critical Current

Links

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 30/00: Reducing energy consumption in communication networks
    • Y02D 30/70: Reducing energy consumption in communication networks in wireless communication networks

Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention relates to the technical field of indoor and outdoor positioning and discloses an indoor and outdoor seamless positioning method based on multi-source fusion, comprising the following steps: S1, outdoor positioning: perform outdoor positioning with GNSS and INS respectively; S2, indoor positioning: perform indoor positioning with UWB and PDR respectively; S3, multi-source data fusion: fuse the outdoor positioning coordinates with the indoor positioning coordinates. The invention first positions outdoors with a global navigation satellite system and an inertial navigation system and fuses the two sets of data to obtain a combined outdoor coordinate position; it then positions indoors with UWB and PDR and fuses those two sets of data to obtain a combined indoor coordinate position; finally, the combined outdoor and combined indoor coordinate positions are fused once more to obtain the final data. The positioning precision is high, and continuous, accurate positioning can be achieved in environments with poor or unstable signals.

Description

Indoor and outdoor seamless positioning method based on multi-source fusion
Technical Field
The invention relates to the technical field of indoor and outdoor positioning, in particular to an indoor and outdoor seamless positioning method based on multi-source fusion.
Background
With the advent of the information age, services based on location information play an increasingly important role in a wide range of application scenarios. In open outdoor scenes, GNSS can provide real-time, reliable and stable navigation and positioning services. In urban environments, however, complex scenes bring problems such as few visible satellites in urban canyons, multipath effects, and loss of signal lock in scenes such as tunnels, so positioning accuracy is poor, reliability is low, and the requirements of full-scene, real-time, high-accuracy and high-reliability positioning are hard to meet. In addition, in indoor environments, where people spend roughly 80% of their daily time, GNSS positioning accuracy drops sharply because of building occlusion and multipath effects and cannot satisfy indoor location-based services. Accurate positioning by combining existing positioning technologies is therefore increasingly critical, and multi-source fusion positioning has been proposed in response: it applies information-fusion methods to satellite navigation positioning, wireless-sensor positioning and other auxiliary positioning technologies to finally obtain a reliable, accurate and stable positioning service.
Chinese patent CN 114993317 A discloses an indoor and outdoor seamless positioning method based on multi-source fusion that can effectively reduce the seamless-positioning error of multi-source heterogeneous data: variable nodes in its fixed-rate factor-graph method are generated at a fixed rate and are not affected by the measurement values, and asynchronous measurements can be associated with the corresponding variable nodes in the graph for optimization. The method avoids the reduced time span and frequent optimization of traditional factor graphs, improves estimation accuracy and reduces the computational load, but it remains difficult for it to achieve accurate positioning in environments with poor or unstable signals.
Disclosure of Invention
The invention aims to provide an indoor and outdoor seamless positioning method based on multi-source fusion so as to solve the problems in the background technology.
In order to achieve the purpose, the invention provides the following technical scheme:
An indoor and outdoor seamless positioning method based on multi-source fusion comprises the following steps:
S1, outdoor positioning: outdoors, perform positioning with a Global Navigation Satellite System (GNSS) and an Inertial Navigation System (INS) respectively to obtain an RTK positioning coordinate and an INS positioning coordinate; fuse the data of the two to obtain a combined outdoor coordinate position, and continuously correct the previous moment with the current-moment data to obtain the outdoor positioning coordinate, realizing outdoor multi-source fusion positioning. The GNSS positioning uses real-time kinematic (RTK) differential positioning;
S2, indoor positioning: indoors, perform positioning with Ultra-Wideband (UWB) and Pedestrian Dead Reckoning (PDR) respectively to obtain a UWB positioning coordinate and a PDR positioning coordinate; fuse the data of the two to obtain a combined indoor coordinate position, and continuously correct the previous moment with the current-moment data to obtain the indoor positioning coordinate, realizing indoor multi-source fusion positioning;
S3, multi-source data fusion: fuse the fused outdoor positioning coordinate data with the fused indoor positioning coordinate data once more, continuously correcting the previous moment with the current-moment data, to realize multi-source-fusion indoor and outdoor seamless positioning.
As a still further scheme of the invention: the specific method of data fusion in the step S1 is as follows:
S11, define the system state vector X_n = [a_n, Δa_n]^T and establish a linear system state equation model:
X_n = A_n·X_{n-1} + W_n (1)
that is,
[a_n, Δa_n]^T = A_n·[a_{n-1}, Δa_{n-1}]^T + W_n (2)
In the above formulas (1) and (2), a_n is the estimated heading angle of the RTK positioning at time n, Δa_n is the estimated heading-angle difference of the INS positioning from time n-1 to time n, X_n and X_{n-1} are the estimated state vectors at times n and n-1 respectively, A_n is the estimated state-transition matrix of the system from time n-1 to time n, and W_n is the estimated state-noise vector; the estimated coordinates at the current moment are calculated from this model.
S12, take the observation vector Y_n = [a′_n, Δa′_n]^T and establish an observation equation model:
Y_n = H_n·Y_{n-1} + V_n (3)
In the above formula (3), a′_n is the observed heading angle at time n, Δa′_n is the observed difference from time n-1 to time n, Y_n and Y_{n-1} are the observation vectors at times n and n-1 respectively, H_n is the observation matrix at time n, and V_n is the observation-noise vector; the observed coordinates at the current moment are calculated from this model.
S13, combine the estimated coordinates with the observed coordinates to obtain the optimal estimate, thereby obtaining the fused coordinates.
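The predict/combine cycle of steps S11 to S13 can be sketched as a standard linear Kalman filter over the heading state. The transition matrix A and the noise covariances Q and R below are illustrative assumptions; the patent does not give their values.

```python
import numpy as np

# Sketch of the S11-S13 fusion: a linear Kalman filter over the state
# X_n = [a_n, da_n]^T (heading angle and heading-angle increment).
A = np.array([[1.0, 1.0],   # a_n = a_{n-1} + da_{n-1}
              [0.0, 1.0]])  # da_n = da_{n-1}
H = np.eye(2)               # both components observed directly (assumption)
Q = np.eye(2) * 0.01        # assumed state-noise covariance (W_n)
R = np.eye(2) * 0.10        # assumed observation-noise covariance (V_n)

def fuse_step(x, P, y):
    """One predict/update cycle: estimate from the state model (S11),
    then combine with the observation y = [a'_n, da'_n] (S12/S13)."""
    # S11: estimated state at the current moment
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    # S12/S13: combine estimate and observation via the Kalman gain
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)
    x_new = x_pred + K @ (y - H @ x_pred)
    P_new = (np.eye(2) - K @ H) @ P_pred
    return x_new, P_new

x, P = np.array([0.0, 0.0]), np.eye(2)
x, P = fuse_step(x, P, np.array([0.5, 0.5]))  # fused heading estimate
```

Because the observation noise R is assumed larger than the state noise Q, the fused heading lands between the model prediction and the raw observation, weighted toward the more trusted source.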
As a still further scheme of the invention: the INS positioning method in the step S1 is as follows:
Set the target's initial velocity v_0, initial position s_0 and initial angle θ_0; the distance s_n and angle θ_n at time n are calculated as:
s_n = s_0 + ∫₀ⁿ ( v_0 + ∫₀ᵗ a dτ ) dt,  θ_n = θ_0 + ∫₀ⁿ w dt (4)
In the above formula (4), s_0 is the initial position of the moving target, θ_0 is the initial angle of the moving target, a_n is the acceleration of the moving target at time n, and w_n is the angular velocity of the moving target at time n.
Substitute the data obtained at the previous moment (i-1) into the calculation formula for the current moment (i) and integrate continuously to obtain the current coordinate position of the target.
As a still further scheme of the invention: the calculation formula for the positioning coordinate position in the step S2 is as follows:
x_i = x_0 + Σ_{k=1}^{i} l_k·sin θ_k,  y_i = y_0 + Σ_{k=1}^{i} l_k·cos θ_k (5)
In the above formula (5), x_i, y_i are the coordinate position of the moving target at time i, x_0, y_0 are the initial coordinate position of the moving target, l_i is the distance moved by the moving target between times i-1 and i, and θ_i is the heading angle at time i; the initial position (x_0, y_0) is obtained by UWB positioning, and the initial heading angle θ_0 is obtained by PDR positioning.
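The step-wise position update can be sketched directly from formula (5): each detected step advances the position by the step length along the current heading. Measuring the heading from the y axis (so sin feeds x and cos feeds y) is an assumption matching the sin/cos placement in the formula.

```python
import math

# Sketch of the PDR position update of formula (5).
def pdr_track(x0, y0, steps):
    """steps: iterable of (l_i, theta_i) pairs; returns the full track."""
    x, y = x0, y0
    track = [(x, y)]
    for l, theta in steps:
        x += l * math.sin(theta)  # east component of the step
        y += l * math.cos(theta)  # north component of the step
        track.append((x, y))
    return track

# illustrative: four 0.7 m steps heading straight along the y axis
track = pdr_track(0.0, 0.0, [(0.7, 0.0)] * 4)
```

In the method of the patent, (x0, y0) would come from a UWB fix and the headings from the PDR sensors.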
As a still further scheme of the invention: the specific method of data fusion in the step S2 is as follows:
S21, define the system state vector Y = [x_hi, y_hi, θ_hi]^T and establish a nonlinear system state equation model, given by formulas (6) and (7) (rendered as images in the original document).
In formulas (6) and (7), x_hi and y_hi are the position of the moving target at time i obtained by PDR positioning, l_hi is the distance moved by the moving target between times i-1 and i obtained by PDR positioning, θ_hi is the heading angle of the moving target at time i obtained by PDR positioning, W_{i-1} is the system state noise, and Δθ is the change in heading angle of the moving target between times i-1 and i obtained by PDR positioning; the estimated coordinates at the current moment are calculated from this model.
S22, take the observation vector Y = [x_ci, y_ci, l_ci, Δθ_ci, θ_ci]^T and establish an observation equation model, given by formula (8) (rendered as an image in the original document).
In formula (8), x_ci and y_ci are the observed position of the moving target at time i, l_ci is the observed distance moved by the moving target between times i-1 and i, θ_ci is the observed heading angle of the moving target at time i, V_i is the observation noise, and Δθ is the observed change in heading angle of the moving target between times i-1 and i; the observed coordinates at the current moment are calculated from this model.
S23, combine the estimated coordinates with the observed coordinates to obtain the optimal estimate, thereby obtaining the fused coordinates.
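Steps S21 to S23 describe a nonlinear predict/update cycle. The sketch below shows only its skeleton: propagate the PDR state (x, y, θ) through the nonlinear step model, then pull it toward the UWB observation. A full implementation would compute an extended Kalman gain from linearized Jacobians; the fixed gain k here is a deliberate illustrative simplification.

```python
import math

# Simplified sketch of the indoor S21-S23 cycle (fixed-gain stand-in
# for the extended Kalman update; k is an assumed blending gain).
def indoor_fuse(state, l, dtheta, uwb_xy, k=0.5):
    x, y, theta = state
    # S21: nonlinear state propagation (estimated coordinates from PDR)
    theta = theta + dtheta
    x = x + l * math.sin(theta)
    y = y + l * math.cos(theta)
    # S22/S23: combine with the observed UWB coordinates
    ux, uy = uwb_xy
    x = x + k * (ux - x)
    y = y + k * (uy - y)
    return (x, y, theta)

state = (0.0, 0.0, 0.0)
# one 0.7 m step, no turn, with a UWB fix at (0.8, 0.8)
state = indoor_fuse(state, 0.7, 0.0, (0.8, 0.8))
```

The blended position lies between the PDR prediction and the UWB fix, which is the qualitative behaviour the optimal estimate of S23 is meant to achieve.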
As a still further scheme of the invention: the specific method of data fusion in the step S3 is as follows:
S31, set the initial state vector X_t = [x, y, x_P, y_P, x_G, y_G]^T and establish a linear system state equation model, given by formula (9) (rendered as an image in the original document).
In formula (9), (x_t, y_t) are the estimated coordinates obtained by multi-source fusion positioning at time t, (x_P, y_P) are the estimated coordinates obtained by outdoor positioning at time t, (x_G, y_G) are the estimated coordinates obtained by indoor positioning at time t, and w_P and w_G are the weights of the outdoor and indoor positioning results in the multi-source fusion; the estimated coordinates at the current moment are calculated from this model.
S32, take the observation vector X′_t = [x′_t, y′_t, x′_P, y′_P, x′_G, y′_G]^T and establish an observation equation model, given by formula (10) (rendered as an image in the original document).
In formula (10), (x′_t, y′_t) are the observed coordinates obtained by multi-source fusion positioning at time t, (x′_Pt, y′_Pt) are the observed coordinates obtained by outdoor positioning at time t, and (x′_Gt, y′_Gt) are the observed coordinates obtained by indoor positioning at time t; the observed coordinates at the current moment are calculated from this model.
S33, combine the estimated coordinates with the observed coordinates to obtain the optimal estimate, thereby obtaining the fused coordinates.
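The core of the S3 stage is a weighted combination of the fused outdoor coordinate (x_P, y_P) and the fused indoor coordinate (x_G, y_G) with weights w_P and w_G. How the weights are chosen (for example from GNSS signal quality near a building entrance) is not specified in the patent; the values below are illustrative.

```python
# Sketch of the S3 weighted fusion of outdoor and indoor coordinates.
def seamless_fix(outdoor, indoor, w_outdoor, w_indoor):
    # Assumed convex combination: weights sum to 1.
    assert abs(w_outdoor + w_indoor - 1.0) < 1e-9
    x = w_outdoor * outdoor[0] + w_indoor * indoor[0]
    y = w_outdoor * outdoor[1] + w_indoor * indoor[1]
    return (x, y)

# near a doorway, trust both sources roughly equally (assumed weights)
pos = seamless_fix((10.0, 4.0), (10.4, 4.2), 0.5, 0.5)
```

Shifting the weights toward 1 for the outdoor term recovers pure GNSS/INS positioning in the open, and toward 1 for the indoor term recovers pure UWB/PDR positioning deep inside a building, which is what makes the handover seamless.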
Compared with the prior art, the invention has the beneficial effects that:
the invention respectively positions outdoors through a global navigation satellite system and an inertial navigation system, and then performs multi-source fusion processing on two groups of data to obtain a combined outdoor coordinate position; performing room respectively through UWB and PDR, and performing multi-source fusion processing on the two groups of data to obtain a combined outdoor coordinate position and a combined outdoor coordinate position; and then the combined outdoor coordinate position and the combined outdoor coordinate position are subjected to multi-source fusion processing to obtain final data, the positioning precision is high, and continuous and accurate positioning can be realized in an environment with poor and unstable signals.
Detailed Description
In the embodiment of the invention, an indoor and outdoor seamless positioning method based on multi-source fusion comprises the following steps:
S1, outdoor positioning: outdoors, perform positioning with a Global Navigation Satellite System (GNSS) and an Inertial Navigation System (INS) respectively to obtain an RTK positioning coordinate and an INS positioning coordinate; fuse the data of the two to obtain a combined outdoor coordinate position, and continuously correct the previous moment with the current-moment data to obtain the outdoor positioning coordinate, reducing the heading-angle measurement error and realizing outdoor multi-source fusion positioning. The GNSS positioning uses real-time kinematic (RTK) differential positioning;
S2, indoor positioning: indoors, perform positioning with Ultra-Wideband (UWB) and Pedestrian Dead Reckoning (PDR) respectively to obtain a UWB positioning coordinate and a PDR positioning coordinate; fuse the data of the two to obtain a combined indoor coordinate position, and continuously correct the previous moment with the current-moment data to obtain the indoor positioning coordinate, realizing indoor multi-source fusion positioning;
S3, multi-source data fusion: fuse the fused outdoor positioning coordinate data with the fused indoor positioning coordinate data once more, continuously correcting the previous moment with the current-moment data, to realize multi-source-fusion indoor and outdoor seamless positioning.
Preferably, the specific method of data fusion in the step S1 is as follows:
S11, define the system state vector X_n = [a_n, Δa_n]^T and establish a linear system state equation model:
X_n = A_n·X_{n-1} + W_n (1)
that is,
[a_n, Δa_n]^T = A_n·[a_{n-1}, Δa_{n-1}]^T + W_n (2)
In the above formulas (1) and (2), a_n is the estimated heading angle of the RTK positioning at time n, Δa_n is the estimated heading-angle difference of the INS positioning from time n-1 to time n, X_n and X_{n-1} are the estimated state vectors at times n and n-1 respectively, A_n is the estimated state-transition matrix of the system from time n-1 to time n, and W_n is the estimated state-noise vector; the estimated coordinates at the current moment are calculated from this model.
S12, take the observation vector Y_n = [a′_n, Δa′_n]^T and establish an observation equation model:
Y_n = H_n·Y_{n-1} + V_n (3)
In the above formula (3), a′_n is the observed heading angle at time n, Δa′_n is the observed difference from time n-1 to time n, Y_n and Y_{n-1} are the observation vectors at times n and n-1 respectively, H_n is the observation matrix at time n, and V_n is the observation-noise vector; the observed coordinates at the current moment are calculated from this model.
S13, combine the estimated coordinates with the observed coordinates to obtain the optimal estimate, thereby obtaining the fused coordinates.
Preferably, the INS positioning method in the step S1 is as follows:
setting an initial velocity v of a target 0 Initial position s 0 Initial angle theta 0 According to the distance s at time n n And angle theta n Calculating the formula:
Figure BDA0004006907960000061
in the above formula (4), s 0 To moveInitial position of moving object, theta 0 As an initial angle of the moving object, a n Acceleration of moving object at time n, w n The angular speed of the moving target at the moment n is obtained;
and (5) bringing the data obtained at the previous moment (i-1) into a calculation formula of the current moment (i), continuously integrating, and obtaining the current coordinate position of the target.
Preferably, the calculation formula for the positioning coordinate position in the step S2 is as follows:
x_i = x_0 + Σ_{k=1}^{i} l_k·sin θ_k,  y_i = y_0 + Σ_{k=1}^{i} l_k·cos θ_k (5)
In the above formula (5), x_i, y_i are the coordinate position of the moving target at time i, x_0, y_0 are the initial coordinate position of the moving target, l_i is the distance moved by the moving target between times i-1 and i, and θ_i is the heading angle at time i; the initial position (x_0, y_0) is obtained by UWB positioning, and the initial heading angle θ_0 is obtained by PDR positioning.
Preferably, the specific method of data fusion in the step S2 is as follows:
S21, define the system state vector Y = [x_hi, y_hi, θ_hi]^T and establish a nonlinear system state equation model, given by formulas (6) and (7) (rendered as images in the original document).
In formulas (6) and (7), x_hi and y_hi are the position of the moving target at time i obtained by PDR positioning, l_hi is the distance moved by the moving target between times i-1 and i obtained by PDR positioning, θ_hi is the heading angle of the moving target at time i obtained by PDR positioning, W_{i-1} is the system state noise, and Δθ is the change in heading angle of the moving target between times i-1 and i obtained by PDR positioning; the estimated coordinates at the current moment are calculated from this model.
S22, take the observation vector Y = [x_ci, y_ci, l_ci, Δθ_ci, θ_ci]^T and establish an observation equation model, given by formula (8) (rendered as an image in the original document).
In formula (8), x_ci and y_ci are the observed position of the moving target at time i, l_ci is the observed distance moved by the moving target between times i-1 and i, θ_ci is the observed heading angle of the moving target at time i, V_i is the observation noise, and Δθ is the observed change in heading angle of the moving target between times i-1 and i; the observed coordinates at the current moment are calculated from this model.
S23, combine the estimated coordinates with the observed coordinates to obtain the optimal estimate, thereby obtaining the fused coordinates.
Preferably, the specific method of data fusion in the step S3 is as follows:
S31, set the initial state vector X_t = [x, y, x_P, y_P, x_G, y_G]^T and establish a linear system state equation model, given by formula (9) (rendered as an image in the original document).
In formula (9), (x_t, y_t) are the estimated coordinates obtained by multi-source fusion positioning at time t, (x_P, y_P) are the estimated coordinates obtained by outdoor positioning at time t, (x_G, y_G) are the estimated coordinates obtained by indoor positioning at time t, and w_P and w_G are the weights of the outdoor and indoor positioning results in the multi-source fusion; the estimated coordinates at the current moment are calculated from this model.
S32, take the observation vector X′_t = [x′_t, y′_t, x′_P, y′_P, x′_G, y′_G]^T and establish an observation equation model, given by formula (10) (rendered as an image in the original document).
In formula (10), (x′_t, y′_t) are the observed coordinates obtained by multi-source fusion positioning at time t, (x′_Pt, y′_Pt) are the observed coordinates obtained by outdoor positioning at time t, and (x′_Gt, y′_Gt) are the observed coordinates obtained by indoor positioning at time t; the observed coordinates at the current moment are calculated from this model.
S33, combine the estimated coordinates with the observed coordinates to obtain the optimal estimate, thereby obtaining the fused coordinates.
The above embodiments are only preferred embodiments of the present invention, but the scope of protection of the present invention is not limited thereto; any equivalent replacement or modification of the technical solutions and inventive concept described herein, made by a person skilled in the art within the scope disclosed by the present invention, shall fall within the scope of protection of the present invention.

Claims (6)

1. An indoor and outdoor seamless positioning method based on multi-source fusion, characterized by comprising the following steps:
S1, outdoor positioning: outdoors, perform positioning with a Global Navigation Satellite System (GNSS) and an Inertial Navigation System (INS) respectively to obtain an RTK positioning coordinate and an INS positioning coordinate; fuse the data of the two to obtain a combined outdoor coordinate position, and continuously correct the previous moment with the current-moment data to obtain the outdoor positioning coordinate, realizing outdoor multi-source fusion positioning, the GNSS positioning using real-time kinematic (RTK) differential positioning;
S2, indoor positioning: indoors, perform positioning with Ultra-Wideband (UWB) and Pedestrian Dead Reckoning (PDR) respectively to obtain a UWB positioning coordinate and a PDR positioning coordinate; fuse the data of the two to obtain a combined indoor coordinate position, and continuously correct the previous moment with the current-moment data to obtain the indoor positioning coordinate, realizing indoor multi-source fusion positioning;
S3, multi-source data fusion: fuse the fused outdoor positioning coordinate data with the fused indoor positioning coordinate data once more, continuously correcting the previous moment with the current-moment data, to realize multi-source-fusion indoor and outdoor seamless positioning.
2. The indoor and outdoor seamless positioning method based on multi-source fusion according to claim 1, wherein the specific method of data fusion in the step S1 is as follows:
S11, define the system state vector X_n = [a_n, Δa_n]^T and establish a linear system state equation model:
X_n = A_n·X_{n-1} + W_n (1)
that is,
[a_n, Δa_n]^T = A_n·[a_{n-1}, Δa_{n-1}]^T + W_n (2)
In the above formulas (1) and (2), a_n is the estimated heading angle of the RTK positioning at time n, Δa_n is the estimated heading-angle difference of the INS positioning from time n-1 to time n, X_n and X_{n-1} are the estimated state vectors at times n and n-1 respectively, A_n is the estimated state-transition matrix of the system from time n-1 to time n, and W_n is the estimated state-noise vector; the estimated coordinates at the current moment are calculated from this model.
S12, take the observation vector Y_n = [a′_n, Δa′_n]^T and establish an observation equation model:
Y_n = H_n·Y_{n-1} + V_n (3)
In the above formula (3), a′_n is the observed heading angle at time n, Δa′_n is the observed difference from time n-1 to time n, Y_n and Y_{n-1} are the observation vectors at times n and n-1 respectively, H_n is the observation matrix at time n, and V_n is the observation-noise vector; the observed coordinates at the current moment are calculated from this model.
S13, combine the estimated coordinates with the observed coordinates to obtain the optimal estimate, thereby obtaining the fused coordinates.
3. The indoor and outdoor seamless positioning method based on multi-source fusion according to claim 1, wherein the INS positioning method in the step S1 is as follows:
Set the target's initial velocity v_0, initial position s_0 and initial angle θ_0; the distance s_n and angle θ_n at time n are calculated as:
s_n = s_0 + ∫₀ⁿ ( v_0 + ∫₀ᵗ a dτ ) dt,  θ_n = θ_0 + ∫₀ⁿ w dt (4)
In the above formula (4), s_0 is the initial position of the moving target, θ_0 is the initial angle of the moving target, a_n is the acceleration of the moving target at time n, and w_n is the angular velocity of the moving target at time n.
Substitute the data obtained at the previous moment (i-1) into the calculation formula for the current moment (i) and integrate continuously to obtain the current coordinate position of the target.
4. The indoor and outdoor seamless positioning method based on multi-source fusion according to claim 1, wherein the calculation formula for the positioning coordinate position in the step S2 is as follows:
x_i = x_0 + Σ_{k=1}^{i} l_k·sin θ_k,  y_i = y_0 + Σ_{k=1}^{i} l_k·cos θ_k (5)
In the above formula (5), x_i, y_i are the coordinate position of the moving target at time i, x_0, y_0 are the initial coordinate position of the moving target, l_i is the distance moved by the moving target between times i-1 and i, and θ_i is the heading angle at time i; the initial position (x_0, y_0) is obtained by UWB positioning, and the initial heading angle θ_0 is obtained by PDR positioning.
5. The indoor and outdoor seamless positioning method based on multi-source fusion according to claim 1, wherein the specific method of data fusion in the step S2 is as follows:
S21, define the system state vector Y = [x_hi, y_hi, θ_hi]^T and establish a nonlinear system state equation model, given by formulas (6) and (7) (rendered as images in the original document).
In formulas (6) and (7), x_hi and y_hi are the position of the moving target at time i obtained by PDR positioning, l_hi is the distance moved by the moving target between times i-1 and i obtained by PDR positioning, θ_hi is the heading angle of the moving target at time i obtained by PDR positioning, W_{i-1} is the system state noise, and Δθ is the change in heading angle of the moving target between times i-1 and i obtained by PDR positioning; the estimated coordinates at the current moment are calculated from this model.
S22, take the observation vector Y = [x_ci, y_ci, l_ci, Δθ_ci, θ_ci]^T and establish an observation equation model, given by formula (8) (rendered as an image in the original document).
In formula (8), x_ci and y_ci are the observed position of the moving target at time i, l_ci is the observed distance moved by the moving target between times i-1 and i, θ_ci is the observed heading angle of the moving target at time i, V_i is the observation noise, and Δθ is the observed change in heading angle of the moving target between times i-1 and i; the observed coordinates at the current moment are calculated from this model.
S23, combine the estimated coordinates with the observed coordinates to obtain the optimal estimate, thereby obtaining the fused coordinates.
6. The indoor and outdoor seamless positioning method based on multi-source fusion according to claim 1, wherein the specific method of data fusion in the step S3 is as follows:
S31, set the initial state vector X_t = [x, y, x_P, y_P, x_G, y_G]^T and establish a linear system state equation model, given by formula (9) (rendered as an image in the original document).
In formula (9), (x_t, y_t) are the estimated coordinates obtained by multi-source fusion positioning at time t, (x_P, y_P) are the estimated coordinates obtained by outdoor positioning at time t, (x_G, y_G) are the estimated coordinates obtained by indoor positioning at time t, and w_P and w_G are the weights of the outdoor and indoor positioning results in the multi-source fusion; the estimated coordinates at the current moment are calculated from this model.
S32, take the observation vector X′_t = [x′_t, y′_t, x′_P, y′_P, x′_G, y′_G]^T and establish an observation equation model, given by formula (10) (rendered as an image in the original document).
In formula (10), (x′_t, y′_t) are the observed coordinates obtained by multi-source fusion positioning at time t, (x′_Pt, y′_Pt) are the observed coordinates obtained by outdoor positioning at time t, and (x′_Gt, y′_Gt) are the observed coordinates obtained by indoor positioning at time t; the observed coordinates at the current moment are calculated from this model.
S33, combine the estimated coordinates with the observed coordinates to obtain the optimal estimate, thereby obtaining the fused coordinates.
CN202211652953.7A 2022-12-19 2022-12-19 Indoor and outdoor seamless positioning method based on multi-source fusion Pending CN115824223A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211652953.7A CN115824223A (en) 2022-12-19 2022-12-19 Indoor and outdoor seamless positioning method based on multi-source fusion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211652953.7A CN115824223A (en) 2022-12-19 2022-12-19 Indoor and outdoor seamless positioning method based on multi-source fusion

Publications (1)

Publication Number Publication Date
CN115824223A true CN115824223A (en) 2023-03-21

Family

ID=85517565

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211652953.7A Pending CN115824223A (en) 2022-12-19 2022-12-19 Indoor and outdoor seamless positioning method based on multi-source fusion

Country Status (1)

Country Link
CN (1) CN115824223A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116086448A (en) * 2023-04-12 2023-05-09 成都信息工程大学 UWB, IMU, GNSS fusion-based multi-scene seamless positioning method for unmanned equipment
CN117148406A (en) * 2023-10-30 2023-12-01 山东大学 Indoor and outdoor seamless elastic fusion positioning method, system, medium and equipment
CN117148406B (en) * 2023-10-30 2024-01-30 山东大学 Indoor and outdoor seamless elastic fusion positioning method, system, medium and equipment

Similar Documents

Publication Publication Date Title
CN104121905B (en) Course angle obtaining method based on inertial sensor
CN115824223A (en) Indoor and outdoor seamless positioning method based on multi-source fusion
Fan et al. Data fusion for indoor mobile robot positioning based on tightly coupled INS/UWB
CN113311411B (en) Laser radar point cloud motion distortion correction method for mobile robot
CN110702091B (en) High-precision positioning method for moving robot along subway rail
CN107490378B (en) Indoor positioning and navigation method based on MPU6050 and smart phone
CN103983263A (en) Inertia/visual integrated navigation method adopting iterated extended Kalman filter and neural network
CN104713554A (en) Indoor positioning method based on MEMS insert device and android smart mobile phone fusion
CN113074739A (en) UWB/INS fusion positioning method based on dynamic robust volume Kalman
CN103471586B (en) The terminal combinations localization method that a kind of sensor is auxiliary and device
CN110412596A (en) A kind of robot localization method based on image information and laser point cloud
CN111982102B (en) BP-EKF-based UWB-IMU positioning method in complex environment
CN109737968B (en) Indoor fusion positioning method based on two-dimensional LiDAR and smart phone
CN109459028A (en) A kind of adaptive step estimation method based on gradient decline
CN116047567B (en) Deep learning assistance-based guard and inertial navigation combined positioning method and navigation method
CN114111802A (en) Pedestrian dead reckoning assisted UWB positioning method
CN104546391B (en) Gyro stabilizer for tactile sticks and complementary filtering method thereof
CN115767412A (en) Indoor positioning method integrating ultra-wideband and inertial measurement unit
CN113639722A (en) Continuous laser scanning registration auxiliary inertial positioning and attitude determination method
CN112862818A (en) Underground parking lot vehicle positioning method combining inertial sensor and multi-fisheye camera
CN113534226B (en) Indoor and outdoor seamless positioning algorithm based on smart phone scene recognition
CN116182855A (en) Combined navigation method of compound eye-simulated polarized vision unmanned aerial vehicle under weak light and strong environment
CN114485623B (en) Focusing distance camera-IMU-UWB fusion accurate positioning method
CN116242372A (en) UWB-laser radar-inertial navigation fusion positioning method under GNSS refusing environment
CN115540854A (en) Active positioning method, equipment and medium based on UWB assistance

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination