CN110595464A - IMU and visual sensor fusion positioning method and device - Google Patents

IMU and visual sensor fusion positioning method and device

Info

Publication number
CN110595464A
Authority
CN
China
Prior art keywords
information
displacement
positioning
orientation
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910764527.4A
Other languages
Chinese (zh)
Inventor
王媛
刘永欣
郭金辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Digital Research Technology Development Co Ltd
Original Assignee
Beijing Digital Research Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Digital Research Technology Development Co Ltd filed Critical Beijing Digital Research Technology Development Co Ltd
Priority to CN201910764527.4A
Publication of CN110595464A
Legal status: Pending

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G01C21/206 Instruments for performing navigational calculations specially adapted for indoor navigation

Abstract

The invention provides a fusion positioning method and a fusion positioning device for an IMU (Inertial Measurement Unit) and a visual sensor. The positioning method makes full use of the complementary strengths of the IMU sensor and the visual sensor of an intelligent terminal: taking the more accurate orientation change from the visual positioning method and the more accurate displacement from the PDR (pedestrian dead reckoning) method as references, it eliminates abrupt 'mutation' data in the less accurate values through the fusion of the two methods, so that the optimal combination of orientation change and displacement is obtained and 'process-level' fusion is realized. Moreover, the partition range can be adjusted according to the data characteristics, and the thresholds can be set flexibly according to the data quality to achieve the best fusion positioning result. Meanwhile, the positioning method disclosed in the embodiments of the invention places low demands on the computing performance of the device, and can realize relative positioning and trajectory tracking without the support of special hardware (such as GPU acceleration).

Description

IMU and visual sensor fusion positioning method and device
Technical Field
The invention relates to the technical field of positioning, in particular to a fusion positioning method and device of an IMU and a vision sensor.
Background
With the gradual improvement of global navigation satellite systems (GNSS) and the rapid development of the mobile internet and wireless communication technology, location-based services have taken on an important role. With the development of personal mobile terminals, positioning based on intelligent terminals and the services built on it increasingly dominate the navigation and positioning market.
In the prior art, GNSS can provide excellent location information in open outdoor areas, but people spend roughly 80% of their time indoors. As people's activity spaces grow in size and complexity, the demand for location services keeps increasing, for example finding one's car in a parking lot, finding a specific commodity, or locating scattered family members. Meanwhile, industries such as precision marketing, intelligent manufacturing, robotics and unmanned medical care also need computers to identify the position of a specific object, and accurate position information is needed in special-population positioning, large-venue management, the Internet of Things and personal location services. In particular, in emergency scenarios such as fire rescue, emergency evacuation and earthquake relief, positioning information in areas with weak or even no GNSS signal is extremely important.
Disclosure of Invention
The invention provides a fusion positioning method of an IMU (Inertial Measurement Unit) and a vision sensor, which is used to address the problems that existing positioning technologies are limited in many applications by their positioning performance, and that indoor environments feature numerous obstacles, many interference sources, complex building structures, changing environmental conditions, and the like.
In a first aspect, an embodiment of the present invention provides a method for fusion positioning of an IMU and a visual sensor, which specifically includes:
determining first orientation information and first displacement information of the device to be detected based on a first positioning algorithm;
determining second orientation information and second displacement information of the device to be detected based on a second positioning algorithm;
determining an orientation difference between the first orientation information and the second orientation information, and a displacement difference between the first displacement information and the second displacement information;
respectively determining a first weight of the first orientation information and a second weight of the second displacement information according to the orientation difference and the displacement difference based on a preset weight determination rule;
and determining the positioning information of the equipment to be detected according to the first orientation information, the first weight of the first orientation information, the first displacement information, the second orientation information, the second displacement information and the second weight of the second displacement information.
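For illustration only, the following is a minimal, self-contained Python sketch of the five steps above. The sample numbers standing in for the outputs of the first (visual) and second (PDR) positioning algorithms, and the simple inverse-proportional weight rule used in step 4, are assumptions made for the example; they are not the preset weight determination rule or experimental values of the invention.

```python
import math

# Steps 1-2: orientation change (degrees) and displacement (metres) of the device
# to be detected, as reported by the two positioning algorithms (assumed sample values).
dio_v, dist_v = 32.0, 0.85   # first positioning algorithm (visual)
dio_p, dist_p = 27.0, 0.60   # second positioning algorithm (PDR)

# Step 3: orientation difference and displacement difference between the two results.
d_dio = dio_v - dio_p        # 5 degrees
d_dist = dist_v - dist_p     # about 0.25 m

# Step 4: weights from a preset rule; this illustrative rule returns 0 outside the
# +/-15 degree / +/-0.5 m partition and shrinks as the difference grows.
w_po = 1.0 / (1.0 + abs(d_dio) / 5.0) if abs(d_dio) <= 15.0 else 0.0      # weight of the PDR orientation
w_vt = 1.0 / (1.0 + abs(d_dist) / 0.125) if abs(d_dist) <= 0.5 else 0.0   # weight of the visual displacement

# Step 5: fused orientation change and displacement (the visual orientation and the
# PDR displacement keep the fixed reference weight 1), then the new position estimate.
dio = (dio_v + w_po * dio_p) / (1.0 + w_po)
dist = (w_vt * dist_v + dist_p) / (1.0 + w_vt)
x, y, heading_deg = 0.0, 0.0, 0.0            # previous fused position and heading (assumed)
heading_deg += dio
x += dist * math.sin(math.radians(heading_deg))
y += dist * math.cos(math.radians(heading_deg))
print(f"fused dio = {dio:.2f} deg, dist = {dist:.2f} m, position = ({x:.2f}, {y:.2f})")
```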
In a possible implementation manner, the determining, according to the first orientation information, the first weight of the first orientation information, the first displacement information, the second orientation information, the second displacement information, and the second weight of the second displacement information, the positioning information of the device to be detected includes: determining result orientation information of the equipment to be detected according to the first orientation information, the first weight of the first orientation information and the second orientation information; determining result displacement information of the equipment to be detected according to the first displacement information, the second displacement information and a second weight of the second displacement information; and determining the positioning information of the equipment to be detected according to the result orientation information and the result displacement information.
In one possible implementation, the method further includes: and constructing the weight determination rule according to the detection precision of the sensor in the equipment to be detected and the environment information of the equipment to be detected.
In one possible implementation, the first positioning algorithm comprises a visual positioning algorithm and the second positioning algorithm comprises pedestrian dead reckoning based on an inertial measurement unit.
In a second aspect, an embodiment of the present invention provides an IMU and visual sensor fusion positioning apparatus, which specifically includes:
the first determining module is used for determining first orientation information and first displacement information of the device to be detected based on a first positioning algorithm;
the second determining module is used for determining second orientation information and second displacement information of the device to be detected based on a second positioning algorithm;
a difference calculation module for determining an orientation difference between the first orientation information and the second orientation information, and a displacement difference between the first displacement information and the second displacement information;
the weight determining module is used for determining a first weight of the first orientation information and a second weight of the second displacement information according to the orientation difference and the displacement difference respectively based on a preset weight determining rule;
and the positioning module is used for determining the positioning information of the equipment to be detected according to the first orientation information, the first weight of the first orientation information, the first displacement information, the second orientation information, the second displacement information and the second weight of the second displacement information.
In one possible implementation, the positioning module includes:
the first positioning unit is used for determining the result orientation information of the equipment to be detected according to the first orientation information, the first weight of the first orientation information and the second orientation information;
the second positioning unit is used for determining the result displacement information of the equipment to be detected according to the first displacement information, the second displacement information and the second weight of the second displacement information;
and determining the positioning information of the equipment to be detected according to the result orientation information and the result displacement information.
In one possible implementation, the apparatus further includes: and the rule establishing module is used for establishing the weight determining rule according to the detection precision of the sensor in the equipment to be detected and the environment information of the equipment to be detected.
In one possible implementation, the first positioning algorithm comprises a visual positioning algorithm and the second positioning algorithm comprises pedestrian dead reckoning based on an inertial measurement unit.
The invention has the beneficial effects that: taking the more accurate orientation change from the visual positioning method and the more accurate displacement from the PDR method as the benchmark, abrupt 'mutation' data in the less accurate values are eliminated through the fusion of the two methods, so that the optimal combination of orientation change and displacement is obtained and 'process-level' fusion is realized.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
The technical solution of the present invention is further described in detail by the accompanying drawings and embodiments.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention. In the drawings:
fig. 1 is a flowchart of a fusion positioning method of an IMU and a visual sensor according to an embodiment of the present invention.
Fig. 2 is a flowchart of a fusion positioning method of an IMU and a visual sensor according to an embodiment of the present invention.
Fig. 3 is a schematic diagram of a differential partitioning according to an embodiment of the present invention.
Fig. 4 is a schematic diagram of region weight assignment according to an embodiment of the present invention.
Fig. 5 is a schematic structural diagram of a fusion positioning apparatus of an IMU and a visual sensor according to an embodiment of the present invention.
Detailed Description
The preferred embodiments of the present invention will be described in conjunction with the accompanying drawings, and it will be understood that they are described herein for the purpose of illustration and explanation and not limitation.
Fig. 1 is a flowchart of a fusion positioning method of an IMU and a visual sensor according to an embodiment of the present invention. The embodiment is applicable to positioning of intelligent devices located indoors. The method is executed by a positioning device, and specifically comprises the following steps:
s110, determining first orientation information and first displacement information of the device to be detected based on a first positioning algorithm.
And S120, determining second orientation information and second displacement information of the device to be detected based on a second positioning algorithm.
The IMU and visual sensor fusion positioning method can be used in service scenarios that require accurate positioning information, such as locating specific objects, positioning special populations, large-venue management, the Internet of Things, and personal location services.
The device to be detected is provided with a plurality of sensors, such as a digital compass for measuring the user's heading, a gyroscope for measuring heading changes, an accelerometer for measuring the user's acceleration, and a camera for acquiring images, which are respectively used to acquire different types of information.
In this implementation manner, the first orientation information and the first displacement information are determined by the first positioning algorithm, and the second orientation information and the second displacement information are determined by the second positioning algorithm, where the first orientation information and the second orientation information are used to represent the terminal attitude information of the device to be detected, and the first displacement information and the second displacement information are used to represent the displacement information of the terminal.
The first positioning algorithm comprises a visual positioning algorithm, and the second positioning algorithm comprises pedestrian dead reckoning (PDR) based on an inertial measurement unit.
S130, determining an orientation difference value between the first orientation information and the second orientation information and a displacement difference value between the first displacement information and the second displacement information.
And S140, respectively determining a first weight of the first orientation information and a second weight of the second displacement information according to the orientation difference and the displacement difference based on a preset weight determination rule.
In this implementation, the weight determination rule may be pre-constructed, and the first weight and the second weight corresponding to different displacement difference values and different orientation difference values are different.
S150, determining the positioning information of the equipment to be detected according to the first orientation information, the first weight of the first orientation information, the first displacement information, the second orientation information, the second displacement information and the second weight of the second displacement information.
In a possible implementation manner, the determining, according to the first orientation information, the first weight of the first orientation information, the first displacement information, the second orientation information, the second displacement information, and the second weight of the second displacement information, the positioning information of the device to be detected includes: determining result orientation information of the equipment to be detected according to the first orientation information, the first weight of the first orientation information and the second orientation information; determining result displacement information of the equipment to be detected according to the first displacement information, the second displacement information and a second weight of the second displacement information; and determining the positioning information of the equipment to be detected according to the result orientation information and the result displacement information.
In one possible implementation, the method further includes: constructing the weight determination rule according to the detection precision of the sensor in the device to be detected and the environment information of the device to be detected. The weight determination rule can be constructed, for example, according to the orientation elements of the camera, the drift error of the inertial navigation device, and the scale parameters.
Fig. 2 is a flowchart of a fusion positioning method of an IMU and a visual sensor according to an embodiment of the present invention. As shown in Fig. 2, the fusion positioning method includes three stages: difference partitioning, weight determination, and weighted fusion. Both the visual positioning method and the PDR positioning method can obtain orientation information and displacement information, from which the position can be estimated, so the accuracy of the orientation and displacement values directly determines the accuracy of the position estimate. The fusion of the visual positioning method and the PDR positioning method yields a visual gyroscope plus inertial odometer scheme (VGO-PDR), and the fusion algorithm is the key to exploiting the respective advantages of visual and PDR positioning. The positioning method provided by the invention, namely Difference Partition Weighted Fusion (DPWF), consists of the following three steps: calculating the differences, determining the weights, and performing the weighted fusion.
First, the orientation and displacement at the current position are calculated by the visual positioning algorithm and by the PDR; then the differences between the orientation and displacement values obtained by the two methods are calculated; next, taking the value obtained by the more accurate method as the reference, the fusion weights of the orientation and displacement values obtained by the visual method and by the PDR are determined from these differences; finally, the fused positioning result is obtained. Through this positioning process, compared with traditional 'result-level' positioning fusion methods, the fusion algorithm proposed in this work realizes 'process-level' fusion of positioning.
Specifically, the difference partition is used for judging the difference range of the orientation value and the displacement value obtained by the visual positioning method and the PDR positioning method, and is a key point for determining the weight in the fusion calculation. Since both positioning methods are relative positioning, it is very difficult to estimate the error of the respective positioning result in the positioning process. The performance of the overall positioning result can be improved by reducing the errors of the orientation and position estimation with the more accurate value as the reference and the second method as the compensation.
In this implementation, the first step of the difference partition is to calculate, for both the visual positioning and the PDR positioning, the change of orientation and the displacement between the current position and the previous position according to the relative positioning process, and then take their differences:

ΔDio = Dio_V - Dio_P
ΔDist = Dist_V - Dist_P

wherein Dio_V and Dist_V respectively represent the change of orientation and the displacement between the visual positioning result and the most recent fused positioning point, and Dio_P and Dist_P are the corresponding PDR values. ΔDio represents the orientation difference between the visual and PDR positioning results, and ΔDist represents the displacement difference between the two methods. Secondly, the differences are divided into regions: the angle difference lies along the longitudinal direction and, according to experimental experience, ranges from -15° to 15°, divided into 10 regions; the displacement difference lies along the lateral direction and is divided into 10 regions from -0.5 m to 0.5 m. As shown in Fig. 3, if ΔDio is 5° and ΔDist is 0.25 m, the difference falls in the gray square region.
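As a non-limiting illustration of the difference partition, the following Python sketch maps a pair of differences to the small square in which it falls, assuming the layout described above (10 bins over -15° to 15° along one axis and 10 bins over -0.5 m to 0.5 m along the other); the function name partition_cell and the row/column indexing are introduced only for this example.

```python
def partition_cell(d_dio_deg, d_dist_m, dio_range=15.0, dist_range=0.5, bins=10):
    """Return (row, col) of the partition square containing the difference pair,
    or None if the pair lies outside the 10 x 10 difference region."""
    if abs(d_dio_deg) > dio_range or abs(d_dist_m) > dist_range:
        return None
    # Shift each difference into [0, 2*range] and scale it into `bins` equal intervals.
    row = min(int((d_dio_deg + dio_range) / (2 * dio_range) * bins), bins - 1)
    col = min(int((d_dist_m + dist_range) / (2 * dist_range) * bins), bins - 1)
    return row, col

# Example from the description: dDio = 5 degrees and dDist = 0.25 m (the gray square).
print(partition_cell(5.0, 0.25))   # -> (6, 7)
```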
In the process of determining the weights, the whole difference region is divided into 100 small squares, a corresponding numerical value is assigned to each small square as a weight, and the weights of the angle and displacement values of the visual positioning and PDR positioning methods are determined by identifying the square in which the differences ΔDio and ΔDist fall. From experimental experience, the average error of the orientation calculated from visual images is smaller, and the average error of its displacement is larger, compared with PDR. Therefore, the more accurate value is used as the reference, and the larger the difference between the less accurate value and the reference, the smaller the weight used in the calculation. Following this principle, the small squares are assigned corresponding values according to the magnitude of the difference. For convenience of understanding and calculation, the weight of the more accurate value is fixed at 1, that is, the weights of the orientation value of the visual positioning method and of the displacement value of the PDR method are set to 1; the value assigned to each square is the weight of the less accurate value, that is, the weights of the orientation value of the PDR method and of the displacement value of the visual positioning method, denoted w_po and w_vt respectively. According to experimental experience, the ranges of the differences ΔDio and ΔDist and the accuracy corresponding to each difference can be roughly judged.
In the embodiment of the invention, the weights are assigned according to experimental experience. As an example, the values are shown in Fig. 4, where the weights of the gray square region are w_po = 1/2 and w_vt = 1/3. Note that a weight takes the value 0 when the difference falls outside the partition: when |ΔDio| > 15°, w_po = 0; when |ΔDist| > 0.5 m, w_vt = 0.
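The following sketch illustrates the weight determination with an assumed 10 × 10 table. The inverse-proportional formulas used to fill the table are placeholders chosen only so that they roughly reproduce the example values of the gray square (w_po = 1/2 at ΔDio = 5°, w_vt = 1/3 at ΔDist = 0.25 m) and drop to 0 outside the partition; the actual values of the invention are assigned from experimental experience as in Fig. 4.

```python
BINS, DIO_RANGE, DIST_RANGE = 10, 15.0, 0.5   # 10 x 10 squares over +/-15 deg and +/-0.5 m

def square_weights(row, col):
    """Placeholder (w_po, w_vt) for one small square, evaluated at its centre."""
    centre_dio = -DIO_RANGE + (row + 0.5) * (2 * DIO_RANGE / BINS)
    centre_dist = -DIST_RANGE + (col + 0.5) * (2 * DIST_RANGE / BINS)
    w_po = 1.0 / (1.0 + abs(centre_dio) / 5.0)      # PDR orientation weight
    w_vt = 1.0 / (1.0 + abs(centre_dist) / 0.125)   # visual displacement weight
    return w_po, w_vt

weight_table = [[square_weights(r, c) for c in range(BINS)] for r in range(BINS)]

def determine_weights(d_dio_deg, d_dist_m):
    """Look up (w_po, w_vt) for the square containing the differences; 0 outside."""
    if abs(d_dio_deg) > DIO_RANGE or abs(d_dist_m) > DIST_RANGE:
        return 0.0, 0.0
    row = min(int((d_dio_deg + DIO_RANGE) / (2 * DIO_RANGE) * BINS), BINS - 1)
    col = min(int((d_dist_m + DIST_RANGE) / (2 * DIST_RANGE) * BINS), BINS - 1)
    return weight_table[row][col]

print(determine_weights(5.0, 0.25))   # roughly (1/2, 1/3), cf. the gray square of Fig. 4
print(determine_weights(20.0, 0.6))   # (0.0, 0.0): outside the partition, weight 0
```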
In the weighted fusion process, the orientation estimated by the PDR, the orientation estimated from the visual images, and the displacements estimated by the two methods are dynamically weighted. The PDR-estimated orientation and displacement are obtained from the built-in sensor data, while the visually estimated data are computed from the positions of vanishing points in successive images. The two methods obtain their data independently of each other, so they can be treated as uncorrelated localization estimates and a linear weighted fusion is performed:
Dio_{t-1|t} = (Dio^V_{t-1|t} + w_po · Dio^P_{t-1|t}) / (1 + w_po)
Dist_{t-1|t} = (w_vt · Dist^V_{t-1|t} + Dist^P_{t-1|t}) / (1 + w_vt)

wherein Dio^V_{t-1|t} and Dist^V_{t-1|t} are respectively the orientation change and the displacement estimated by the visual method from time t-1 to time t, Dio^P_{t-1|t} and Dist^P_{t-1|t} are the corresponding PDR estimates, w_po and w_vt are the key weighting factors, and Dio_{t-1|t} and Dist_{t-1|t} are the fused orientation change and displacement.
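A short sketch checking that the weighted fusion behaves as described, assuming the normalized linear combination reconstructed above: each fused value lies between the visual and PDR estimates, and when a weight drops to 0 the fused value falls back to the reference estimate, which is how abrupt 'mutation' data in the less accurate value are eliminated.

```python
def fuse(dio_v, dist_v, dio_p, dist_p, w_po, w_vt):
    """Normalized linear weighted fusion: the visual orientation and the PDR
    displacement carry the fixed reference weight 1."""
    dio = (dio_v + w_po * dio_p) / (1.0 + w_po)
    dist = (w_vt * dist_v + dist_p) / (1.0 + w_vt)
    return dio, dist

# Typical case: the fused values lie between the visual and PDR estimates.
print(fuse(32.0, 0.85, 27.0, 0.60, w_po=0.5, w_vt=1/3))   # ~ (30.33, 0.66)

# "Mutation" case: a wildly wrong PDR orientation falls outside the partition and gets
# weight 0, so the fused orientation falls back to the (more accurate) visual value.
print(fuse(32.0, 0.85, 80.0, 0.60, w_po=0.0, w_vt=1/3))   # (32.0, ~0.66)
```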
Then, the positioning result after the fusion of visual positioning and PDR positioning is estimated using the mathematical principle of the PDR algorithm:

x_t = x_{t-1} + Dist_{t-1|t} · sin(θ_{t-1} + Dio_{t-1|t})
y_t = y_{t-1} + Dist_{t-1|t} · cos(θ_{t-1} + Dio_{t-1|t})

wherein (x_t, y_t) is the location estimated by the fused visual positioning and PDR positioning method at time t, (x_{t-1}, y_{t-1}) is the result of the localization fusion at time t-1, and θ_{t-1} is the heading of the fused track at time t-1, updated as θ_t = θ_{t-1} + Dio_{t-1|t}.
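To illustrate the final position update, the sketch below propagates a short track from a sequence of fused (Dio, Dist) pairs, under the assumption made above that the heading is accumulated from the fused orientation changes; the step values are invented for the example.

```python
import math

def propagate(x, y, heading_deg, fused_steps):
    """PDR-style propagation of the fused track: each step carries a fused
    orientation change (degrees) and a fused displacement (metres)."""
    track = [(x, y)]
    for dio, dist in fused_steps:
        heading_deg += dio                       # theta_t = theta_{t-1} + Dio_{t-1|t}
        x += dist * math.sin(math.radians(heading_deg))
        y += dist * math.cos(math.radians(heading_deg))
        track.append((round(x, 2), round(y, 2)))
    return track

# Assumed fused (Dio, Dist) results for four consecutive epochs.
print(propagate(0.0, 0.0, 0.0, [(30.3, 0.66), (-5.0, 0.70), (0.0, 0.65), (12.0, 0.68)]))
```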
In this implementation, first orientation information and first displacement information of the device to be detected are determined based on a first positioning algorithm; second orientation information and second displacement information of the device to be detected are determined based on a second positioning algorithm; an orientation difference between the first orientation information and the second orientation information and a displacement difference between the first displacement information and the second displacement information are determined; a first weight of the first orientation information and a second weight of the second displacement information are determined from the orientation difference and the displacement difference, respectively, based on a preset weight determination rule; and the positioning information of the device to be detected is determined from the first orientation information, the first weight of the first orientation information, the first displacement information, the second orientation information, the second displacement information and the second weight of the second displacement information. This makes full use of the complementary strengths of the intelligent terminal's IMU sensor and visual sensor: taking the more accurate orientation change from the visual positioning method and the more accurate displacement from the PDR method as references, abrupt 'mutation' data in the less accurate values are eliminated through the fusion of the two methods, so that the optimal combination of orientation change and displacement is obtained and 'process-level' fusion is realized. Moreover, the partition range can be adjusted according to the data characteristics, and the thresholds can be set flexibly according to the data quality to achieve the best fusion positioning result. Meanwhile, the positioning method disclosed in the embodiments of the invention places low demands on the computing performance of the device, and can realize relative positioning and trajectory tracking without the support of special hardware (such as GPU acceleration).
Fig. 5 is a schematic structural diagram of a fusion positioning apparatus of an IMU and a visual sensor according to a second embodiment of the present invention. As shown in Fig. 5, the fusion positioning apparatus specifically includes:
a first determining module 201, configured to determine first orientation information and first displacement information of the device to be detected based on a first positioning algorithm;
a second determining module 202, configured to determine second orientation information and second displacement information of the device to be detected based on a second positioning algorithm;
a difference calculation module 203 for determining an orientation difference between the first orientation information and the second orientation information, and a displacement difference between the first displacement information and the second displacement information;
a weight determining module 204, configured to determine, based on a preset weight determining rule, a first weight of the first orientation information and a second weight of the second displacement information according to the orientation difference and the displacement difference, respectively;
the positioning module 205 is configured to determine the positioning information of the device to be detected according to the first orientation information, the first weight of the first orientation information, the first displacement information, the second orientation information, the second displacement information, and the second weight of the second displacement information.
In one possible implementation, the positioning module includes: the first positioning unit is used for determining the result orientation information of the equipment to be detected according to the first orientation information, the first weight of the first orientation information and the second orientation information;
the second positioning unit is used for determining the result displacement information of the equipment to be detected according to the first displacement information, the second displacement information and the second weight of the second displacement information;
and determining the positioning information of the equipment to be detected according to the result orientation information and the result displacement information.
In one possible implementation, the apparatus further includes: and the rule establishing module is used for establishing the weight determining rule according to the detection precision of the sensor in the equipment to be detected and the environment information of the equipment to be detected.
In one possible implementation, the first positioning algorithm comprises a visual positioning algorithm and the second positioning algorithm comprises pedestrian dead reckoning based on an inertial measurement unit.
The positioning apparatus provided in this embodiment can be used to perform the fusion positioning method provided in the first embodiment, and has the corresponding functions and advantages.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (8)

1. A fusion positioning method of an IMU and a vision sensor is characterized by comprising the following steps:
determining first orientation information and first displacement information of the device to be detected based on a first positioning algorithm;
determining second orientation information and second displacement information of the device to be detected based on a second positioning algorithm;
determining an orientation difference between the first orientation information and the second orientation information, and a displacement difference between the first displacement information and the second displacement information;
respectively determining a first weight of the first orientation information and a second weight of the second displacement information according to the orientation difference and the displacement difference based on a preset weight determination rule;
and determining the positioning information of the equipment to be detected according to the first orientation information, the first weight of the first orientation information, the first displacement information, the second orientation information, the second displacement information and the second weight of the second displacement information.
2. The method of claim 1, wherein determining the positioning information of the device to be detected according to the first orientation information, the first weight of the first orientation information, the first displacement information, the second orientation information, the second displacement information, and the second weight of the second displacement information comprises:
determining result orientation information of the equipment to be detected according to the first orientation information, the first weight of the first orientation information and the second orientation information;
determining result displacement information of the equipment to be detected according to the first displacement information, the second displacement information and a second weight of the second displacement information;
and determining the positioning information of the equipment to be detected according to the result orientation information and the result displacement information.
3. The method according to any one of claims 1 or 2, characterized in that the method further comprises: constructing the weight determination rule according to the detection precision of the sensor in the equipment to be detected and the environment information of the equipment to be detected.
4. The method of claim 1, wherein:
the first positioning algorithm comprises a visual positioning algorithm and the second positioning algorithm comprises pedestrian dead reckoning based on an inertial measurement unit.
5. An IMU and visual sensor fusion positioning device, comprising:
the first determining module is used for determining first orientation information and first displacement information of the device to be detected based on a first positioning algorithm;
the second determining module is used for determining second orientation information and second displacement information of the device to be detected based on a second positioning algorithm;
a difference calculation module for determining an orientation difference between the first orientation information and the second orientation information, and a displacement difference between the first displacement information and the second displacement information;
the weight determining module is used for determining a first weight of the first orientation information and a second weight of the second displacement information according to the orientation difference and the displacement difference respectively based on a preset weight determining rule;
and the positioning module is used for determining the positioning information of the equipment to be detected according to the first orientation information, the first weight of the first orientation information, the first displacement information, the second orientation information, the second displacement information and the second weight of the second displacement information.
6. The apparatus of claim 5, wherein the positioning module comprises:
the first positioning unit is used for determining the result orientation information of the equipment to be detected according to the first orientation information, the first weight of the first orientation information and the second orientation information;
the second positioning unit is used for determining the result displacement information of the equipment to be detected according to the first displacement information, the second displacement information and the second weight of the second displacement information;
and determining the positioning information of the equipment to be detected according to the result orientation information and the result displacement information.
7. The apparatus of any one of claims 5 or 6, further comprising:
and the rule establishing module is used for establishing the weight determining rule according to the detection precision of the sensor in the equipment to be detected and the environment information of the equipment to be detected.
8. The apparatus of claim 5, wherein:
the first positioning algorithm comprises a visual positioning algorithm and the second positioning algorithm comprises pedestrian dead reckoning based on an inertial measurement unit.
CN201910764527.4A 2019-08-19 2019-08-19 IMU and visual sensor fusion positioning method and device Pending CN110595464A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910764527.4A CN110595464A (en) 2019-08-19 2019-08-19 IMU and visual sensor fusion positioning method and device

Publications (1)

Publication Number Publication Date
CN110595464A true CN110595464A (en) 2019-12-20

Family

ID=68854600

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910764527.4A Pending CN110595464A (en) 2019-08-19 2019-08-19 IMU and visual sensor fusion positioning method and device

Country Status (1)

Country Link
CN (1) CN110595464A (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102435188A (en) * 2011-09-15 2012-05-02 南京航空航天大学 Monocular vision/inertia autonomous navigation method for indoor environment
CN107084717A (en) * 2011-12-07 2017-08-22 三星电子株式会社 Mobile terminal and its method for the alignment system based on magnetic field map
CN108496057A (en) * 2016-01-19 2018-09-04 飞利浦照明控股有限公司 It is positioned based on light source
CN108362289A (en) * 2018-02-08 2018-08-03 浙江大学城市学院 A kind of mobile intelligent terminal PDR localization methods based on Multi-sensor Fusion

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111220153A (en) * 2020-01-15 2020-06-02 西安交通大学 Positioning method based on visual topological node and inertial navigation
CN111220153B (en) * 2020-01-15 2021-10-01 西安交通大学 Positioning method based on visual topological node and inertial navigation
CN113055598A (en) * 2021-03-25 2021-06-29 浙江商汤科技开发有限公司 Orientation data compensation method and device, electronic equipment and readable storage medium


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20191220