CN116412816A - Navigation positioning method, positioning device and positioning system based on data fusion - Google Patents

Navigation positioning method, positioning device and positioning system based on data fusion

Info

Publication number
CN116412816A
CN116412816A CN202111660614.9A
Authority
CN
China
Prior art keywords
positioning
data
visual
inertial navigation
gnss
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111660614.9A
Other languages
Chinese (zh)
Inventor
苗晓婷
王睿
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SAIC Motor Corp Ltd
Original Assignee
SAIC Motor Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SAIC Motor Corp Ltd filed Critical SAIC Motor Corp Ltd
Priority to CN202111660614.9A priority Critical patent/CN116412816A/en
Publication of CN116412816A publication Critical patent/CN116412816A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/18 Stabilised platforms, e.g. by gyroscope
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 Determining position
    • G01S19/45 Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
    • G01S19/47 Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement the supplementary measurement being an inertial measurement, e.g. tightly coupled inertial

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)

Abstract

A navigation positioning method based on data fusion comprises the following steps: initializing a visual inertial navigation positioning coordinate system and a GNSS positioning coordinate system, so that visual inertial navigation positioning data can be converted into the corresponding GNSS positioning data representation, thereby aligning the two coordinate systems; starting to acquire visual inertial navigation positioning data and GNSS positioning data; judging whether the GNSS positioning meets a preset condition; if so, adopting the GNSS positioning data as the plane positioning result, otherwise adopting the visual inertial navigation positioning data as the plane positioning result; and taking the height data in the visual inertial navigation positioning data as the height positioning result, fusing it with the plane positioning result, and outputting both. The application further provides a positioning device and a positioning system according to the method.

Description

Navigation positioning method, positioning device and positioning system based on data fusion
Technical Field
The embodiment of the application relates to the technical field of positioning navigation, in particular to a positioning method, a positioning device and a positioning system based on data fusion.
Background
Positioning and navigation technology is widely applied in many fields, both civil and military, and sustained research and development has produced a number of different positioning systems, including the Global Positioning System (GPS), inertial navigation systems (INS), celestial navigation systems (CNS), and others.
The global navigation satellite system (GNSS), of which the Global Positioning System (GPS) is the best-known member, is widely used in everyday life to provide high-precision, all-weather, wide-area positioning services. Although GNSS positioning does not accumulate error over time, its drawbacks of a low update rate and high susceptibility to interference cannot be neglected. In civilian use the satellite navigation error is considerable, making highly accurate position information hard to obtain: a handheld GPS receiver, as a typical GNSS device, can show a positioning error of about 20 meters without correction, and 5-10 meters after correction.
Outdoors, user equipment usually relies on assisted positioning in addition to satellite positioning to reduce errors; a mobile positioning system, for example, relies on cellular base stations in addition to GNSS. In environments where the GNSS signal is blocked, navigation positioning that depends on the GNSS signal alone degrades.
An inertial navigation system uses inertial sensors, namely gyroscopes and accelerometers (together forming an inertial measurement unit, IMU), to achieve all-weather, globally available, autonomous three-dimensional positioning. Inertial navigation measures the motion parameters of an object through its inertial sensors and requires no external signals, which makes it inherently interference-resistant; however, the initial conditions and instrument parameter settings become all the more important, since they determine the accumulated error and the final positioning accuracy.
VIO (visual-inertial odometry) is also known as visual-inertial navigation. It estimates the motion state or pose change of a device from the data of a camera and an IMU mounted on it. Compared with purely visual odometry (VO), VIO can make good use of the complementarity of camera and IMU. The IMU's accelerometer allows scale to be estimated, solving the problem that monocular VO cannot recover scale. Conversely, a vision sensor generally fails in poorly textured scenes, such as those dominated by glass or white walls, while an IMU can provide relatively high-accuracy displacement data over short periods but accumulates significant error with prolonged use. Compared with pure VO or a pure inertial system, VIO integrates the two: when image quality is poor and the vision sensor fails, motion can be estimated from the integrated IMU data; under pure rotational motion, when VO rotation estimation goes wrong, the gyroscope estimate can replace the computed pose; and in a static scene, the invariance of the camera-estimated pose can be used to correct the static drift of the IMU. The two devices thus complement each other and positioning precision improves. Moreover, because a stand-alone camera-plus-IMU positioning system depends on no external signals, a VIO-based positioning system can achieve more accurate position-change estimation where the GNSS signal is weak, making it particularly suitable for conditions in which the GNSS signal is blocked.
In summary, combining internal visual and inertial navigation data with external GNSS data can overcome the loss of GNSS signal in certain specific scenes, improve positioning accuracy overall, and support deeper application development across a variety of scenarios.
Disclosure of Invention
In view of the foregoing, embodiments of the present application provide a data-fusion-based navigation positioning scheme to at least partially solve the above-mentioned problems.
According to a first aspect of the embodiments of the present application, a navigation positioning method based on data fusion is provided, comprising: initializing a visual inertial navigation positioning coordinate system and a GNSS positioning coordinate system, so that visual inertial navigation positioning data can be converted into the corresponding GNSS positioning data representation, thereby aligning the coordinate systems; starting to acquire visual inertial navigation positioning data and GNSS positioning data; judging whether the GNSS positioning meets a preset condition; if so, adopting the GNSS positioning data as the plane positioning result, otherwise adopting the visual inertial navigation positioning data as the plane positioning result; and taking the height data in the visual inertial navigation positioning data as the height positioning result, fusing it with the plane positioning result, and outputting both.
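As a minimal sketch of this first-aspect flow (the function and field names are illustrative assumptions, and a fixed HDOP gate stands in for the patent's "preset condition"), the source selection and height fusion could look like:

```python
# Hypothetical sketch, not the patent's implementation. The plane position
# comes from GNSS when the quality gate passes, otherwise from the
# visual-inertial solution; the height always comes from visual inertial
# navigation.

HDOP_THRESHOLD = 5.0  # assumed gate value; the patent leaves it configurable

def fuse_position(gnss_fix, vio_fix):
    """Return a fused (x, y, z) tuple.

    gnss_fix: dict with 'x', 'y', 'hdop', or None when no fix is available
    vio_fix:  dict with 'x', 'y', 'z', already aligned to the GNSS frame
    """
    gnss_ok = gnss_fix is not None and gnss_fix["hdop"] <= HDOP_THRESHOLD
    if gnss_ok:
        plane = (gnss_fix["x"], gnss_fix["y"])  # GNSS as the plane result
    else:
        plane = (vio_fix["x"], vio_fix["y"])    # fall back to VIO
    # Height is always taken from the visual inertial navigation data.
    return (plane[0], plane[1], vio_fix["z"])
```

With a good fix, `fuse_position({'x': 1.0, 'y': 2.0, 'hdop': 1.2}, {'x': 1.1, 'y': 2.2, 'z': 5.2})` yields `(1.0, 2.0, 5.2)`: GNSS plane coordinates combined with the visual-inertial height.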
Optionally, the initialization of the visual inertial navigation positioning coordinate system and the GNSS positioning coordinate system is completed by initializing the course angle and the scale of the two coordinate systems; the roll angle and the pitch angle of neither coordinate system are transformed.
Optionally, G = C·V, where G is the three-dimensional coordinate matrix of the GNSS positioning coordinate system, V is the three-dimensional coordinate matrix of the visual inertial navigation positioning coordinate system, and the rotation matrix C converting visual inertial navigation positioning coordinates into GNSS positioning coordinates is

$$C = \begin{bmatrix} \cos\theta & -\sin\theta & 0 \\ \sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{bmatrix}$$

where θ is the deviation between the course angles of the two coordinate systems.
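Numerically, this heading-only alignment can be sketched as below; this is an illustrative reconstruction in which the sign convention of the rotation (i.e., which way a positive θ turns) is an assumption:

```python
import math

def heading_rotation(theta):
    """Rotation matrix C about the vertical axis by heading deviation theta
    (radians). Roll and pitch rows stay identity, matching the statement
    that only the course angle differs between the two coordinate systems."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0],
            [s,  c, 0.0],
            [0.0, 0.0, 1.0]]

def to_gnss_frame(v, theta):
    """G = C . V for a single point v = [x, y, z]."""
    C = heading_rotation(theta)
    return [sum(C[i][j] * v[j] for j in range(3)) for i in range(3)]

# A quarter-turn heading deviation swings the horizontal components while
# leaving the height component z unchanged.
x, y, z = to_gnss_frame([1.0, 0.0, 5.0], math.pi / 2)
```

For the quarter-turn example, x goes to (numerically) zero, y to one, and z stays at 5.0, illustrating that height is untouched by the course-angle correction.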
Optionally, whether the GNSS positioning meets the preset condition is determined by comparing the HDOP value against a threshold, the condition being met when the HDOP value does not exceed the threshold.
Optionally, when the GNSS positioning meets the preset condition, the visual inertial navigation positioning and the GNSS positioning are first aligned in time and in coordinate system, and a loosely coupled nonlinear optimization is then performed on the data of the two to obtain a fused global positioning result.
Optionally, when the GNSS positioning does not meet the preset condition, the visual inertial navigation positioning data and the GNSS positioning data are not fused, and the local positioning information of the visual inertial navigation positioning is used directly as the global result, yielding a short-term global positioning result.
Optionally, the method further comprises comparing and analyzing the GNSS positioning data and the visual inertial navigation data to eliminate accumulated errors.
Optionally, the visual inertial navigation data comprises image data and inertial navigation data, the frequency of the GNSS positioning data is 10Hz, and the visual inertial navigation data comprises dynamic image data with a frame rate of 15-60fps and inertial navigation data with a frequency of 50-400 Hz.
Optionally, the visual inertial navigation positioning includes initializing data input by the local positioning system, and the visual inertial navigation positioning is composed of visual navigation and inertial navigation.
Optionally, the initializing operation on the data input by the local positioning system includes performing front-end visual tracking and structure-from-motion (SFM) visual pose solving on the visual navigation data, pre-integrating the inertial navigation data, and aligning the visual navigation data with the inertial navigation data to obtain the visual scale, the direction of gravitational acceleration, and the initial system velocity.
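The pre-integration step is not written out in the patent; in the standard VINS-style formulation commonly used for it (an assumption here, not the patent's own notation), the pre-integrated position, velocity, and rotation increments between consecutive keyframe body frames $b_k$ and $b_{k+1}$ are

```latex
\begin{aligned}
\alpha_{b_{k+1}}^{b_k} &= \iint_{t \in [t_k,\, t_{k+1}]} R_t^{b_k}\,\bigl(\hat a_t - b_a\bigr)\, dt^2, \\
\beta_{b_{k+1}}^{b_k}  &= \int_{t \in [t_k,\, t_{k+1}]} R_t^{b_k}\,\bigl(\hat a_t - b_a\bigr)\, dt, \\
\dot\gamma_t^{b_k}     &= \tfrac{1}{2}\,\gamma_t^{b_k} \otimes
  \begin{bmatrix} 0 \\ \hat\omega_t - b_\omega \end{bmatrix},
\end{aligned}
```

where $\hat a_t$ and $\hat\omega_t$ are the raw accelerometer and gyroscope readings, $b_a$ and $b_\omega$ the sensor biases, and $R_t^{b_k}$ the rotation from time $t$ into frame $b_k$. Because the increments depend only on the measurements and the bias estimates, they need not be re-propagated when the starting pose changes during optimization, which is what makes the subsequent alignment step efficient.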
Optionally, the method further comprises performing tightly coupled local nonlinear optimization over a sliding window on the visual navigation data and the inertial navigation data, namely jointly optimizing the keyframe images within the sliding window and the inertial navigation data: the visual navigation constraints and the inertial navigation constraints are constructed into a cost function via bundle adjustment, and the cost function is minimized by nonlinear least squares to obtain an optimized visual inertial navigation local positioning result.
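Spelled out, such a sliding-window cost takes the standard visual-inertial bundle-adjustment form (a generic formulation; the patent does not give the explicit expression):

```latex
\min_{\mathcal{X}} \left\{
  \bigl\| r_p - H_p \mathcal{X} \bigr\|^2
  + \sum_{k \in \mathcal{B}} \bigl\| r_{\mathcal{B}}\bigl(\hat z_{b_k b_{k+1}}, \mathcal{X}\bigr) \bigr\|_{P_{b_k b_{k+1}}}^2
  + \sum_{(l,j) \in \mathcal{C}} \rho\Bigl( \bigl\| r_{\mathcal{C}}\bigl(\hat z_l^{c_j}, \mathcal{X}\bigr) \bigr\|_{P_l^{c_j}}^2 \Bigr)
\right\}
```

Here $\mathcal{X}$ stacks the keyframe states in the window, the first term is a marginalization prior, $r_{\mathcal{B}}$ are the inertial (pre-integration) residuals, $r_{\mathcal{C}}$ the visual reprojection residuals weighted by their covariances $P$, and $\rho$ an optional robust kernel; minimizing this cost by nonlinear least squares (e.g., Gauss-Newton or Levenberg-Marquardt) yields the optimized local positioning result.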
According to a second aspect of embodiments of the present application, there is provided a positioning device, including: the visual inertial navigation positioning module is used for processing the acquired visual image data and inertial navigation data to generate visual inertial navigation positioning data; the GNSS positioning module is used for acquiring and calculating GNSS signal data and generating GNSS positioning data; and the analysis fusion module is used for partially fusing or integrally fusing the visual inertial navigation positioning data and the GNSS positioning data according to the judgment result of the GNSS so as to output fused positioning data containing the height positioning data.
According to a third aspect of embodiments of the present application, there is provided a positioning device, including: the system comprises a visual navigation module and/or an inertial navigation module, wherein the visual navigation module is used for obtaining visual image data, and the inertial navigation module is used for obtaining inertial navigation data so as to generate positioning data based on the visual image data and/or the inertial navigation data; the GNSS positioning module is used for acquiring and calculating GNSS signal data and generating GNSS positioning data; and the analysis fusion module is used for partially fusing or integrally fusing the positioning data based on the visual image data and/or the inertial navigation data with the GNSS positioning data according to the judgment result of the GNSS so as to output fused positioning data containing the height positioning data.
According to a fourth aspect of embodiments of the present application, there is provided a positioning system comprising: the system comprises a processor, a visual sensor, an inertial sensor and a GNSS positioning sensor; the processor calculates data obtained by the visual sensor and/or the inertial sensor to obtain local positioning information, and the GNSS positioning sensor is used for receiving GNSS positioning data; the processor judges whether GNSS positioning meets preset conditions or not; if yes, adopting the GNSS positioning data as a plane positioning result, otherwise, adopting the vision sensor data and/or the inertial sensor data as a plane positioning result; the visual sensor data and/or the height information in the inertial sensor data are used as a height positioning result to be fused and output with the plane positioning result; the processor is also used for initializing the vision sensor data and/or the inertial sensor data, realizing coordinate system alignment and performing fusion calculation on data from different positioning systems.
The beneficial effects of the method and device are that, by fully exploiting the data-acquisition capabilities of the inertial sensors, image sensors, and positioning sensors already present in mobile phones, dashboard cameras, and vehicles, different positioning mechanisms can be integrated and the navigation precision of electronic equipment raised to the sub-meter level without introducing new media or devices; and by introducing GNSS data to correct the visual inertial navigation positioning data in suitable scenes, the accumulated error of the visual inertial navigation positioning data is eliminated, so that the stand-alone visual inertial navigation positioning algorithm is continuously optimized and both the algorithm and the accuracy of its results improve as the amount of usage data grows. Without limitation, the beneficial effects further include that fusing GNSS positioning data with visual inertial navigation positioning data greatly improves the accuracy of height positioning compared with GNSS-based height positioning alone, which helps determine whether the electronic device is on a bridge, in a culvert, etc., and switch the positioning data source accordingly; in other words, the introduction of height positioning data in turn improves the accuracy of the plane positioning determination.
Drawings
To illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings required by the embodiments or the description of the prior art are briefly introduced below. The drawings described below are obviously only some of the embodiments described in this application, and a person of ordinary skill in the art may derive other drawings from them.
FIG. 1 is a flow chart of a positioning method based on data fusion, to which embodiments of the present application are applied;
FIG. 2 is a flow chart of the positioning method according to a second embodiment of the present application;
FIG. 3 is a block diagram of a positioning device according to a third embodiment of the present application;
FIG. 4 is a block diagram of a positioning device 50 according to a third embodiment of the present application;
FIG. 5 is a graph comparing experimental results of a fusion positioning method and a conventional GNSS positioning method according to an embodiment of the present application;
FIG. 6 is a height trajectory formed as a bridge passes by using a fusion positioning method according to an embodiment of the present application.
Detailed Description
In order to better understand the technical solutions in the embodiments of the present application, the following descriptions will clearly and completely describe the technical solutions in the embodiments of the present application with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only some embodiments of the present application, but not all embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the embodiments of the present application shall fall within the scope of protection of the embodiments of the present application.
Embodiments of the present application are further described below, without limitation, in conjunction with the accompanying drawings. In the following, unless otherwise specified, "navigation", "positioning", and "navigational positioning" mean substantially the same thing: collecting position information and/or track information of an electronic device. "Visual inertial navigation" is understood as a technique combining visual positioning navigation with inertial positioning navigation, and may also be written "visual, inertial navigation"; in it, at least one of visual positioning navigation data and inertial navigation data is acquired at any time. Furthermore, although the embodiments below are described with GPS as the example, it is readily understood that the United States' GPS, Russia's GLONASS, the European Union's GALILEO, and China's BeiDou satellite navigation system are all representative GNSS systems, and other high-altitude positioning systems, including satellite positioning, hot-air-balloon positioning, and aircraft-based positioning systems, fall within the meaning and spirit of GNSS in this application. The devices capable of executing the navigation positioning method according to the embodiments include, but are not limited to, vehicles such as intelligent automobiles, vehicles with an in-vehicle infotainment unit, vehicles with a central control unit or host computer, motorcycles, trains, and aircraft, as well as electronic devices such as mobile phones, tablet computers, watches, and smart glasses.
Example 1
The specific embodiment of the present invention is as follows. As shown in FIG. 1, the system first performs initialization of the coordinate system (S101). Comparing the coordinate system of the visual inertial navigation positioning system with that of the GNSS positioning system, the pitch angle and the roll angle are consistent, while the course angle carries a constant deviation. To convert a positioning result (i.e., coordinate values) from the visual inertial navigation positioning system into the form used by the GNSS positioning system, an operation must be performed on the three-dimensional coordinate values to eliminate the course-angle deviation between the two coordinate systems. Assume the three-dimensional coordinate value of a position obtained in the visual inertial navigation positioning system is [x, y, z], and the corresponding value in the east-north-up GNSS world coordinate system with the same origin is [X, Y, Z]. With v denoting the visual inertial navigation coordinate system and g the east-north-up GNSS world coordinate system, and $C_v^g$ the rotation matrix from the visual inertial navigation coordinate system to the GNSS world coordinate system, the coordinate conversion formula is

$$\begin{bmatrix} X \\ Y \\ Z \end{bmatrix} = C_v^g \begin{bmatrix} x \\ y \\ z \end{bmatrix}$$

Assuming the deviation of the course angle is θ, then

$$C_v^g = \begin{bmatrix} \cos\theta & -\sin\theta & 0 \\ \sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{bmatrix}$$
In the actual alignment process, once the course angles of the two systems are obtained, the positioning result of the visual inertial navigation positioning system is converted via the formula into the corresponding coordinate values in the east-north-up GNSS world coordinate system with the same origin. This completes the initialization of the attitude angles between the different coordinate systems, after which the subsequent steps are executed.
According to the second step of FIG. 1, initialization of the coordinate scale is performed (S102). Within a limited geographic range the GPS world coordinate system can be treated as a geographic coordinate system whose unit is degrees, with the X axis as longitude and the Y axis as latitude. The visual inertial navigation positioning system uses meters as its unit, and the X, Y, and Z values obtained after rotation are measured in meters, so the two cannot be added directly without unit conversion. Treating the Earth as a sphere, along a given parallel each 1° of longitude corresponds to cos(latitude) × 111110 meters, and along a meridian each 1° of latitude corresponds to about 111110 meters. Treating the relationship as linear, the conversion between meters and degrees of longitude/latitude is

$$X_{\text{longitude}} = \frac{X}{111110 \cdot \cos(\text{latitude})}, \qquad Y_{\text{latitude}} = \frac{Y}{111110}$$

The initial longitude and latitude values are acquired at start-up, and the converted longitude and latitude increments X_longitude and Y_latitude are added to them respectively, yielding the geographic coordinates on the final map and realizing an approximate coordinate translation.
S101 yields the three-dimensional coordinate values [X, Y, Z] after the course-angle conversion of the coordinate system; to display the corresponding position on a map, only the two-dimensional values [X, Y] are needed. Applying the formulas above then gives the scale-converted X_longitude and Y_latitude coordinates, with which the positioning result obtained by the visual inertial navigation positioning system can be displayed on a two-dimensional geographic map.
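The scale-conversion formulas can be checked with a short sketch (function and parameter names are illustrative; the 111110 m/degree constant and the spherical-Earth linearization are the ones used in this embodiment):

```python
import math

METERS_PER_DEGREE = 111110.0  # spherical-Earth approximation used in this embodiment

def meters_to_degrees(x_east_m, y_north_m, lat0_deg, lon0_deg):
    """Convert local east/north offsets in meters into an approximate
    longitude/latitude by the linear relationship described in the text:
    1 degree of latitude is ~111110 m everywhere, and 1 degree of
    longitude shrinks by cos(latitude)."""
    dlon = x_east_m / (METERS_PER_DEGREE * math.cos(math.radians(lat0_deg)))
    dlat = y_north_m / METERS_PER_DEGREE
    return lon0_deg + dlon, lat0_deg + dlat
```

Moving 111110 m due north from (0°, 0°) gives almost exactly 1° of latitude, and at 60° latitude only half as many meters of easting are needed per degree of longitude, reflecting the cos(latitude) factor.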
Next, the visual inertial navigation positioning data and the GNSS positioning data may be collected by the visual sensor, the inertial sensor, and the GNSS positioning sensor of the electronic device, that is, the execution of S103 is started.
Next, as shown in S104 of FIG. 1, judgment of the GNSS positioning accuracy begins, determined mainly from the GNSS signal quality. Before the visual inertial navigation positioning system is used to supplement or replace the GNSS positioning system, the characteristics of the two systems' positioning values must be analyzed. By comparison, the GNSS positioning value is less precise, at best about 5 meters, but GNSS accumulates no error over long-term operation; the short-term positioning value of the visual inertial navigation positioning system is highly precise, within a meter, but contains obvious accumulated error after long operation. Outdoors, the periods of poor GNSS precision are limited, such as when passing through an underpass, an underground street, or a block with poor signal, and over such short periods the accumulated error of the visual inertial navigation positioning system is small enough to be ignored. The GNSS positioning accuracy is therefore judged: if it is high, the GNSS system supplies the positioning result, eliminating the accumulated error brought by the visual inertial navigation system, while the visual inertial navigation system performs the height positioning and strengthens the system's height accuracy, as shown in S105 of FIG. 1. If the GNSS signal is lost through occlusion, GNSS positioning accuracy is poor and the system switches to the visual inertial navigation positioning system, as shown in S106 of FIG. 1, positioning accurately from internal data that is free of external-signal interference.
After one of the two positioning modes is selected, the GNSS positioning accuracy continues to be judged in real time, and the system switches between the two modes as conditions dictate.
Further, in S104, whether the GNSS meets the preset condition may be determined by whether the HDOP value of the GNSS signal is less than or equal to a preset threshold. An HDOP of 1 means the signal geometry is excellent and the confidence highest, suitable for all-weather applications demanding the utmost precision; an HDOP below 5 indicates signal strength and a confidence interval that satisfy the positioning-accuracy requirements of typical navigation; above 5 the navigation result has low confidence and should be discarded, or treated only as a very rough initial estimate of the current position. Different initial HDOP thresholds can therefore be defined for different usage scenarios, for example 5 under general city conditions, and the threshold may be adjusted according to the strength of the local navigation-satellite signal, with a larger initial value generally chosen where the signal is weaker.
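These HDOP bands map naturally onto a small gating helper; this is an illustrative sketch, not code from the patent:

```python
def hdop_quality(hdop):
    """Coarse interpretation of HDOP following the bands described in the
    text: 1 is ideal, below 5 is adequate for typical navigation, and 5 or
    above should be discarded (or used only as a rough initial guess)."""
    if hdop <= 1.0:
        return "ideal"
    if hdop < 5.0:
        return "navigation"
    return "discard"

def select_plane_source(hdop, threshold=5.0):
    """Choose the plane-positioning source. The threshold is tunable per
    scenario, e.g. raised where local satellite signal strength is low."""
    return "gnss" if hdop < threshold else "vio"
```

For example, `select_plane_source(4.9)` keeps GNSS as the plane source, while `select_plane_source(5.0)` falls back to the visual inertial navigation result.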
As shown in FIG. 5, comparing positioning experiment data between the visual inertial navigation + GNSS multi-sensor fusion positioning system and a single-GNSS-sensor positioning system, the blue track belongs to the fusion system and the red track to GNSS alone. When the HDOP value shows that GNSS accuracy meets the requirement, no assistance from the visual inertial navigation positioning system is needed, so the red and blue tracks coincide (blue lies underneath red). In the open air, GNSS accuracy generally degrades only for short periods, during which the visual inertial navigation positioning system replaces the larger-error GNSS. To show the effect of fusion more clearly, the experiment deliberately chose blocks with taller buildings and more trees. On such streets the probability of GNSS degradation is higher; the actual walking route was always parallel to the main road, but when GNSS accuracy dropped, its track deviated heavily and drifted toward the center of the road, whereas the fused track avoided the large error and stayed closer to the true route, i.e., straight and parallel to the main road. Once the HDOP value shows that GNSS accuracy has recovered sufficiently, GNSS takes over from the visual inertial navigation positioning system.
According to FIG. 6, the track shown is that of the visual inertial navigation positioning system when passing over an overpass (lower left image). The actual height of the overpass is 5 meters; in repeated tests, the height value of the highest point returned by the visual inertial navigation positioning system fell within the range of 5.10-5.35 meters, i.e., an error below 0.35 meters. The height estimation of the visual inertial navigation positioning system is therefore very accurate.
The method of fusing visual inertial navigation positioning data and GNSS positioning data of the present embodiment may be performed by any suitable electronic device having data processing capability, including but not limited to servers, mobile terminals (such as mobile phones and tablets), PCs, and the like.
The navigation positioning method of this embodiment realizes higher-precision navigation positioning under different working-condition scenarios such as overpasses and city blocks, and has the beneficial effects of the corresponding method embodiment, which are not repeated herein. In addition, for the functional implementation of each step in the navigation positioning method of this embodiment, reference may be made to the description of the corresponding part in the foregoing or following method embodiments, which is likewise not repeated herein.
Example Two
Referring to FIG. 2, a schematic step diagram of a navigation positioning method according to a second embodiment of the present application is shown. The embodiments of the present application do not limit the specific operations for performing the method, but provide the following implementable modes and optimizations.
As shown in FIG. 2, step S201 initializes the GNSS and visual inertial navigation coordinate system transformation, step S202 performs the initialized GNSS and visual inertial navigation coordinate scale transformation, and step S203 then starts acquiring GNSS and visual inertial navigation positioning data simultaneously. It should be understood that S203 may also be arranged before step S201, and may specifically include the following: the mobile device is provided with a monocular camera or other visual sensing device and an inertial navigation module fixedly connected with the camera as a local positioning system, and with a satellite receiving device for GPS (or other coordinate) signals as a global (world) positioning system; during movement of the device, image data at a frame rate of 30 fps, IMU (inertial navigation) data at a frequency of 200 Hz, and GPS signals at a frequency of 10 Hz are collected and serve as inputs to the whole positioning algorithm. Steps S201 and S202 can in practice be understood as being performed simultaneously, and include initializing the input data of the local positioning system: performing front-end visual tracking and SFM visual pose solving on the images, pre-integrating the IMU data, and aligning the two to obtain the visual scale, the gravitational acceleration direction, and the initial speed of the system. A tightly-coupled local nonlinear optimization over a sliding window is then performed on the image and IMU data: the images of the key frames in the sliding window and the IMU data are jointly optimized, the visual constraints and the IMU constraints are constructed into a cost function by a bundle adjustment method, and the cost function is minimized by nonlinear least squares to obtain an optimized local positioning result.
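The coordinate-system alignment of S201/S202 rotates the local track about the vertical axis by the course-angle deviation and applies the visual scale, leaving roll and pitch untouched. A minimal sketch under that assumption, with hypothetical function and parameter names:

```python
import math

def align_vio_to_gnss(p_vio, theta, scale):
    """Rotate a local visual inertial navigation position about the vertical
    axis by the course-angle deviation theta and apply the visual scale,
    yielding coordinates expressed in the GNSS (global) frame. Only heading
    and scale are corrected, matching the initialization described above."""
    x, y, z = p_vio
    c, s = math.cos(theta), math.sin(theta)
    return (scale * (c * x - s * y),
            scale * (s * x + c * y),
            scale * z)
```

For instance, with a course-angle deviation of 90 degrees and a visual scale of 2, the local point (1, 0, 3) maps to approximately (0, 2, 6) in the global frame.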
The step S204 of determining whether the GNSS data meets the preset condition is similar to the first embodiment and is not repeated: it may be performed by determining whether the HDOP value is less than or equal to a threshold value. It should be understood, however, that other indexes such as the geometric dilution of precision, the three-dimensional position dilution of precision, the clock-error dilution of precision, the horizontal-component dilution of precision, and the vertical-component dilution of precision may also be used to determine whether the GNSS meets the preset condition, and the thresholds of these factors may vary or lie within a range; all of the above determination manners do not depart from the spirit of the present application and fall within its protection scope.
Similar to the first embodiment, when the GNSS satisfies the preset condition, S205 is executed: the GNSS positioning result is adopted, with the visual inertial navigation data used as the altitude positioning result. If the GNSS does not satisfy the preset condition, S207 is executed: the visual inertial navigation positioning result is adopted after coordinate transformation.
Further, the accuracy of the visual inertial navigation data can be calibrated by exploiting the fact that GNSS positioning has no accumulated error, and the calibration methods for different places and times can be recorded, so that the calibration method closest to the current requirement can be invoked when needed to enhance the accuracy of the visual inertial navigation positioning result. It is easy to understand that the plane positioning output by S206 still comes solely from the GNSS positioning data, while the height data comes from the visual inertial navigation positioning; the output result serves as a calibration reference to eliminate the accumulated error of the visual inertial navigation positioning.
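The calibration idea described above — using the drift-free GNSS fixes to estimate and remove the accumulated error of the visual inertial navigation track — can be sketched as follows. The constant-bias model and the function names are illustrative assumptions, not the patent's method:

```python
def estimate_vio_bias(gnss_fixes, vio_fixes):
    """Estimate the accumulated visual inertial navigation drift as the mean
    offset between time-aligned GNSS and VIO plane positions; since GNSS has
    no accumulated error, this offset approximates the VIO drift."""
    n = len(gnss_fixes)
    dx = sum(g[0] - v[0] for g, v in zip(gnss_fixes, vio_fixes)) / n
    dy = sum(g[1] - v[1] for g, v in zip(gnss_fixes, vio_fixes)) / n
    return dx, dy

def correct_vio(vio_fix, bias):
    """Apply the estimated bias to a subsequent VIO plane position."""
    return vio_fix[0] + bias[0], vio_fix[1] + bias[1]
```

In a deployed system the bias would be re-estimated whenever the GNSS passes the preset condition, so that the corrected local track stays anchored to the drift-free global fixes.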
It should be noted that, according to implementation requirements, each component/step described in the embodiments of the present application may be split into more components/steps, and two or more components/steps or part of operations of the components/steps may be combined into new components/steps, so as to achieve the purposes of the embodiments of the present application.
Example Three
As shown in fig. 3, a positioning device 40 according to a third embodiment of the present invention is provided, which includes a visual inertial navigation positioning module 401, a GNSS positioning module 402, and an analysis fusion module 403.
The positioning device 40 may be an electronic device such as a vehicle, a cell phone, etc. having a visual sensor, an inertial navigation sensor, and a data processor.
The visual inertial navigation positioning module 401 is an electronic device including a visual/image sensor, such as a CCD or CMOS, that can acquire images in real time, together with an inertial navigation sensor (inertial sensor) mainly used for detecting and measuring acceleration, tilt, shock, vibration, rotation, and multi-degree-of-freedom (DoF) motion. The inertial sensor includes an accelerometer (acceleration sensor) and an angular velocity sensor (gyroscope), their single-, dual-, and tri-axis combinations into an IMU (inertial measurement unit), and an AHRS (attitude and heading reference system including a magnetic sensor), so that the accumulated travel, direction, and the like are calculated by analyzing data such as images and speeds to realize local navigation positioning.
The GNSS positioning module 402 refers to a positioning and navigation module whose receiver type covers at least one of the GPS, BeiDou, GALILEO, and GLONASS satellite positioning systems.
The analysis fusion module 403 may be any unit with a data processing function, such as an MCU, CPU, or GPU, for performing image processing and data calculation. It is easy to understand that the function of the analysis fusion module is to execute the steps in the first and second embodiments of the present invention, but the module may also be partially disposed in the cloud and partially in the electronic device, or entirely disposed in the cloud.
Example Four
As shown in FIG. 4, a positioning device 50 according to the fourth embodiment of the present invention is provided, in which the visual inertial navigation positioning module 401 of the third embodiment is split into two independent modules: a visual navigation module 501 and an inertial navigation module 502. In other words, only the visual navigation module 501 or only the inertial navigation module 502 may be adopted according to the actual situation, which enhances the accuracy of GNSS positioning to a certain extent; if conditions allow, the visual navigation module 501 and the inertial navigation module 502 are preferably adopted together to realize visual inertial navigation positioning fused with GNSS positioning, further enhancing the navigation positioning accuracy.
The visual navigation module 501 is an electronic device that includes a visual/image sensor, such as a CCD or CMOS, and can acquire images in real time.
The inertial navigation module 502 is an inertial navigation sensor (inertial sensor) mainly used to detect and measure acceleration, tilt, shock, vibration, rotation, and multi-degree-of-freedom (DoF) motion, including an accelerometer (acceleration sensor) and an angular velocity sensor (gyroscope), their single-, dual-, and tri-axis combinations into an IMU (inertial measurement unit), and an AHRS (attitude and heading reference system including a magnetic sensor).
The GNSS positioning module 503 refers to a positioning and navigation module whose receiver type covers at least one of the GPS, BeiDou, GALILEO, and GLONASS satellite positioning systems.
The analysis fusion module 504 may be any unit with a data processing function, such as an MCU, CPU, or GPU, for performing image processing and data calculation. It is easy to understand that the function of the analysis fusion module is to execute the steps in the first and second embodiments of the present invention, but the module may also be partially disposed in the cloud and partially in the electronic device, or entirely disposed in the cloud.
Example Six
Although not shown, the present application also provides a positioning system comprising a processor, a visual sensor, an inertial sensor, and a GNSS positioning sensor. The processor calculates local positioning information from the data obtained by the visual sensor and/or the inertial sensor, and the GNSS positioning sensor is used for receiving GNSS positioning data. GNSS and visual inertial navigation positioning data are then collected simultaneously, and the processor judges whether the GNSS positioning meets the preset condition; if so, the GNSS positioning data are adopted as the plane positioning result, otherwise the visual sensor data and/or the inertial sensor data are adopted as the plane positioning result. The height information in the visual sensor data and/or the inertial sensor data is used as the height positioning result and is fused with the plane positioning result for output. The processor is also used for initializing the visual sensor data and/or the inertial sensor data, realizing coordinate system alignment, and performing fusion calculation on data from the different positioning systems.
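Putting the pieces of the system together, one plausible sketch of the fused output is shown below; the tuple layout (east, north, up) and the threshold parameter are assumptions for illustration only:

```python
def fused_position(gnss, vio, hdop, hdop_threshold=5.0):
    """Combine sensors as the system above describes: plane coordinates come
    from GNSS when its HDOP passes the preset condition, otherwise from the
    visual/inertial result; altitude always comes from the visual/inertial
    result, which the experiments above found accurate for height.
    Both `gnss` and `vio` are (east, north, up) tuples."""
    if hdop <= hdop_threshold:
        east, north = gnss[0], gnss[1]
    else:
        east, north = vio[0], vio[1]
    return east, north, vio[2]  # altitude from visual inertial navigation
```

With a good fix (HDOP 1.0) the plane position follows the GNSS, while under a degraded fix (HDOP 7.0) the plane position falls back to the visual inertial navigation track; the altitude is taken from the visual inertial result in both cases.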
The above-described methods according to the embodiments of the present application may be implemented in hardware or firmware, or as software or computer code that can be stored in a recording medium such as a CD-ROM, RAM, floppy disk, hard disk, or magneto-optical disk, or as computer code originally stored in a remote recording medium or a non-transitory machine-readable medium and downloaded through a network to be stored in a local recording medium, so that the methods described herein can be processed by such software stored on a recording medium using a general-purpose computer, a special-purpose processor, or programmable or dedicated hardware such as an ASIC or FPGA. It is understood that the computer, processor, microprocessor controller, or programmable hardware includes a storage component (e.g., RAM, ROM, flash memory, etc.) that can store or receive software or computer code that, when accessed and executed by the computer, processor, or hardware, implements the methods described herein. Further, when a general-purpose computer accesses code for implementing the methods shown herein, execution of the code converts the general-purpose computer into a special-purpose computer for executing those methods.
Those of ordinary skill in the art will appreciate that the elements and method steps of the examples described in connection with the embodiments disclosed herein can be implemented as electronic hardware, or as a combination of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the embodiments of the present application.
The above embodiments are only for illustrating the embodiments of the present application, but not for limiting the embodiments of the present application, and various changes and modifications can be made by one skilled in the relevant art without departing from the spirit and scope of the embodiments of the present application, so that all equivalent technical solutions also fall within the scope of the embodiments of the present application, and the scope of the embodiments of the present application should be defined by the claims.

Claims (14)

1. The navigation positioning method based on data fusion is characterized by comprising the following steps:
initializing a visual inertial navigation positioning coordinate system and a GNSS positioning coordinate system, so that visual inertial navigation positioning data are converted into corresponding GNSS positioning data expression forms, and alignment of the coordinate systems is realized;
starting to acquire visual inertial navigation positioning data and GNSS positioning data;
judging whether the GNSS positioning meets the preset condition or not;
if yes, adopting the GNSS positioning data as a plane positioning result, otherwise, adopting the visual inertial navigation positioning data as a plane positioning result;
and taking the height data in the visual inertial navigation positioning data as a height positioning result and fusing and outputting the height positioning result and the plane positioning result.
2. The method of claim 1, wherein initializing the visual inertial navigation positioning coordinate system and the GNSS positioning coordinate system is accomplished by initializing heading angles and scales of the two coordinate systems without transforming roll angles and pitch angles of either coordinate system.
3. The method of claim 1, wherein G = C·V, where G is the three-dimensional coordinate matrix in the GNSS positioning coordinate system, V is the three-dimensional coordinate matrix in the visual inertial navigation positioning coordinate system, and the rotation matrix C converting visual inertial navigation positioning coordinates into GNSS positioning coordinates is

    C = [ cosθ   -sinθ   0 ]
        [ sinθ    cosθ   0 ]
        [  0       0     1 ]

where θ is the deviation value of the course angles under the two coordinate systems.
4. The method of claim 1, wherein determining whether the GNSS positioning satisfies the preset condition is performed by determining whether the HDOP value is less than or equal to a threshold value.
5. The method of claim 1, wherein when the GNSS positioning meets a preset condition, the visual inertial navigation positioning is aligned with the GNSS positioning in time and in coordinate system, and then the data of the two are subjected to loosely-coupled nonlinear optimization to obtain a fused global positioning result.
6. The method of claim 1, wherein when the GNSS positioning does not meet the preset condition, the local positioning information of the visual inertial navigation positioning is directly globally used without fusing the data of the visual inertial navigation positioning and the GNSS positioning, so as to obtain a short-term global positioning result.
7. The method of claim 1 or 6, further comprising comparing the GNSS positioning data and the visual inertial navigation data to eliminate accumulated errors.
8. The method of claim 1, wherein the visual inertial navigation data comprises image data and inertial navigation data, the GNSS positioning data having a frequency of 10Hz, the visual inertial navigation data comprising dynamic image data having a frame rate of 15-60fps and inertial navigation data having a frequency of 50-400 Hz.
9. The method of claim 1, wherein the visual inertial navigation positioning comprises initializing data entered by a local positioning system, the visual inertial navigation positioning being comprised of visual navigation and inertial navigation.
10. The method of claim 9, wherein initializing the data input by the local positioning system comprises front-end visual tracking and SFM visual pose solving of visual navigation data, pre-integrating inertial navigation data, and aligning the visual navigation and inertial navigation data to obtain a visual scale, a gravitational acceleration direction, and a system initial speed.
11. The method of claim 10, further comprising performing a tightly-coupled local nonlinear optimization over a sliding window on the visual navigation data and the inertial navigation data, i.e., performing a nonlinear optimization on the images of the key frames in the sliding window and the inertial navigation data, constructing the visual navigation constraints and the inertial navigation constraints into a cost function by a bundle adjustment method, and minimizing the cost function by a nonlinear least squares method to obtain an optimized visual inertial navigation local positioning result.
12. A positioning device, comprising:
the visual inertial navigation positioning module is used for processing the acquired visual image data and inertial navigation data to generate visual inertial navigation positioning data;
the GNSS positioning module is used for acquiring and calculating GNSS signal data and generating GNSS positioning data;
and the analysis fusion module is used for partially fusing or integrally fusing the visual inertial navigation positioning data and the GNSS positioning data according to the judgment result of the GNSS so as to output fused positioning data containing the height positioning data.
13. A positioning device, comprising:
the system comprises a visual navigation module and/or an inertial navigation module, wherein the visual navigation module is used for obtaining visual image data, and the inertial navigation module is used for obtaining inertial navigation data so as to generate positioning data based on the visual image data and/or the inertial navigation data;
the GNSS positioning module is used for acquiring and calculating GNSS signal data and generating GNSS positioning data;
and the analysis fusion module is used for partially fusing or integrally fusing the positioning data based on the visual image data and/or the inertial navigation data with the GNSS positioning data according to the judgment result of the GNSS so as to output fused positioning data containing the height positioning data.
14. A positioning system, comprising:
the system comprises a processor, a visual sensor, an inertial sensor and a GNSS positioning sensor;
the processor calculates data obtained by the visual sensor and/or the inertial sensor to obtain local positioning information, and the GNSS positioning sensor is used for receiving GNSS positioning data;
the processor judges whether GNSS positioning meets preset conditions or not;
if yes, adopting the GNSS positioning data as a plane positioning result, otherwise, adopting the vision sensor data and/or the inertial sensor data as a plane positioning result;
the visual sensor data and/or the height information in the inertial sensor data are used as a height positioning result to be fused and output with the plane positioning result;
the processor is also used for initializing the vision sensor data and/or the inertial sensor data, realizing coordinate system alignment and performing fusion calculation on data from different positioning systems.
CN202111660614.9A 2021-12-30 2021-12-30 Navigation positioning method, positioning device and positioning system based on data fusion Pending CN116412816A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111660614.9A CN116412816A (en) 2021-12-30 2021-12-30 Navigation positioning method, positioning device and positioning system based on data fusion


Publications (1)

Publication Number Publication Date
CN116412816A true CN116412816A (en) 2023-07-11

Family

ID=87058324



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination