CN109655058A - An inertial/visual intelligent combined navigation method - Google Patents

An inertial/visual intelligent combined navigation method

Info

Publication number
CN109655058A
Authority
CN
China
Prior art keywords
navigation
visual
decision
output
inertial
Prior art date
Legal status
Pending
Application number
CN201811579143.7A
Other languages
Chinese (zh)
Inventor
高嘉瑜
景鑫
李阳
王健
Current Assignee
CETC 20 Research Institute
Original Assignee
CETC 20 Research Institute
Priority date
Filing date
Publication date
Application filed by CETC 20 Research Institute filed Critical CETC 20 Research Institute
Priority to CN201811579143.7A
Publication of CN109655058A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments


Abstract

The present invention provides an inertial/visual intelligent combined navigation method. When the navigation expert system receives input such as navigation information and anomaly-detection results, the inference engine first sets the rules in the knowledge base running, reads the current state of each ranging source, and generates intermediate facts that are saved to the comprehensive database. The inference engine then matches the rules in the knowledge base against the real-time state of each ranging source, selects the currently optimal navigation fusion mode, activates and executes the corresponding rules, and chooses the navigation sources; the explanation engine can explain and output to the user the decision basis for the currently selected navigation fusion mode. Finally, if the navigation fusion mode switches, a decision can be made on the performance trend of the fusion mode. The invention intelligently manages the visual and inertial sensors and makes decisions over them, so as to improve the navigation positioning accuracy and reliability of the system and provide navigation parameters for all kinds of unmanned systems.

Description

Inertial/visual intelligent combined navigation method
Technical Field
The invention relates to a navigation method, and belongs to the field of integrated navigation.
Background
An inertial navigation system is completely autonomous, highly covert, strongly interference-resistant, and provides continuous information, which makes it one of the core sensors for autonomous navigation control of a carrier. However, its error drifts continuously as time increases, so a single navigation mode clearly has difficulty meeting the accuracy, reliability, and real-time requirements placed on a navigation system. With continued progress in visual sensors and image-processing technology, inertial/visual combination has become one of the important development trends; moreover, vision-based navigation is more sensitive to navigation parameters such as relative pose, so it can be used in enclosed or complex environments and further improves autonomous positioning and navigation capability. When a carrier moves for a long time and a scene map is available, scene matching can be chosen for navigation. Scene matching provides high positioning accuracy inside a matching area, but because matching areas are discontinuous, continuous long-term navigation positioning information is hard to obtain; scene matching is therefore usually combined with an inertial system as an auxiliary means.
Under modern high-technology warfare conditions, fighter aircraft face increasingly severe challenges and ever higher navigation requirements: they must operate around the clock in harsh environments to support a variety of missions, and navigation positioning information of the highest possible accuracy must be provided continuously under all conditions. This motivates a combined visual and inertial mode. Each navigation device has its own working principle, positioning accuracy, and advantages and disadvantages, and its navigation information is an observation of part of the platform state; if all navigation information were fused without distinction, the result would not be optimal and could even be worse than not combining at all.
Disclosure of Invention
In order to overcome the shortcomings of the prior art, the invention provides an inertial/visual intelligent combined navigation method that intelligently manages the visual and inertial sensors and makes decisions over them, so as to improve the navigation positioning accuracy and reliability of the system and provide navigation parameters for all kinds of unmanned systems.
The technical scheme adopted by the invention for solving the technical problem comprises the following steps:
generating inertial measurement unit (IMU) simulation data by using a gyroscope and an accelerometer, wherein the attitude angular rate output by the carrier in flight is $\dot{\Theta}$ and the angular velocity of the carrier in the carrier coordinate frame relative to the geographic coordinate frame is $\omega_{nb}^{b}$, wherein $C_n^b$ is the attitude matrix converting from the geographic coordinate frame to the carrier coordinate frame;
the ideal output of the gyroscope is $\omega_{ib}^{b} = \omega_{nb}^{b} + C_n^b\,\omega_{in}^{n}$, wherein $C_n^b\,\omega_{in}^{n}$ is the projection onto the carrier frame of the angular velocity of the geographic frame relative to the inertial frame;
the ideal output of the accelerometer is $f^{b} = C_n^b\,f^{n}$, wherein $f^{n}$ is the specific force in the geographic frame;
adding random errors including white noise, first-order Markov errors and random constant errors to the output of the ideal gyroscope to serve as output data of the simulated gyroscope;
adding random errors including first-order Markov errors to the output of an ideal accelerometer to serve as simulation output data of the accelerometer;
secondly, acquiring images, extracting features from the images, tracking and computing the feature points, and computing the visual navigation parameters in a relative coordinate frame using the bundle adjustment method; integrating the IMU data between two camera frames through pre-integration, registering the visual solution with the IMU pre-integration result, and aligning the time stamps;
inputting the inertial and visual navigation information and the performance-evaluation results into the intelligent navigation expert system; the inference engine sets the rules in the knowledge base running, reads the current state of each sensing source, generates intermediate facts, and stores them in the comprehensive database; the inference engine then matches the rules in the knowledge base against the real-time state of each sensing source, selects the currently optimal navigation fusion mode, activates and executes the corresponding rules, and outputs the decision basis for the current selection to the user; the inference engine searches the knowledge base according to the input navigation-source sensor information, activating the corresponding action whenever the set conditions are met, until inference finishes and the optimal navigation decision information is obtained;
and fourthly, performing unscented Kalman filtering combination on the scene matching result and the IMU inertial data.
The invention has the following beneficial effects: an intelligent decision stage is added before the navigation information is combined, and the selection strategy and switching of the inertial/visual integrated navigation working mode are studied, so that the integrated navigation system can intelligently judge the state of each single sensor and of the integrated navigation, complete its own decisions according to the sensor states, guarantee overall navigation positioning performance, and obtain the highest combined accuracy.
Drawings
FIG. 1 is a basic block diagram of an expert system;
FIG. 2 is a diagram of the reasoning process of the navigation expert system;
FIG. 3 is a schematic flow chart of the steps of the present invention;
fig. 4 is an intelligent decision flow diagram.
Detailed Description
The present invention will be further described with reference to the following drawings and examples, which include, but are not limited to, the following examples.
Combining the characteristics of the navigation-source sensors, the invention provides an intelligent integrated navigation processing method based on an expert system, and explores how a rule-based expert system can assist navigation-source decisions inside an integrated navigation system, so as to achieve a navigation-source combination with higher positioning and attitude-determination accuracy.
As shown in figure 1, the navigation expert system consists mainly of a comprehensive database, an inference engine, and a knowledge base. When the navigation expert system receives input such as navigation information and anomaly-detection results, the inference engine first sets the rules in the knowledge base running, reads the current state of each ranging source, and generates intermediate facts that are stored in the comprehensive database. The inference engine then matches the rules in the knowledge base against the real-time state of each ranging source, selects the currently optimal navigation fusion mode, activates and executes the corresponding rules, and chooses the navigation sources; the explanation engine can interpret and output to the user the decision basis for the currently selected navigation fusion mode. Finally, if the navigation fusion mode switches, a decision can be made on the performance trend of the fusion mode.
The most important components of the rule-based expert system are: a knowledge base, an inference engine, and a comprehensive database.
Knowledge base: the knowledge base here contains all the rules. It adopts frames as the knowledge-representation method together with heuristic search over production rules, so it holds a large amount of navigation-domain expertise stored in rule form.
Inference engine: the inference engine that controls the overall operation is the control center of a rule-based expert system. It determines which rule premises satisfy their conditions, so that the conclusions of the corresponding rules are activated and the corresponding operations executed; the inference process is in fact a search-and-matching process. The inference engine searches the knowledge base according to the input navigation-source data, and the corresponding actions are activated whenever certain conditions are met, until inference finishes and the optimal navigation decision information is obtained. The inference engine in this work uses forward chaining with the Rete pattern-matching algorithm: starting from the input navigation-source information, it searches heuristically with the rules; if a rule matches, that rule is selected and its conclusion is added to the comprehensive database; if the problem is not yet fully solved, inference continues, and once it is solved, inference exits. The algorithm greatly improves inference efficiency and gives the integrated navigation processing a fast decision response. The inference process is shown in figure 2.
Comprehensive database: the database required for inference stores the intermediate states, facts, data, initial states, goals, and so on produced during inference. It acts as a working memory holding the facts answered by the user, the known facts, and the facts derived by inference; its contents can also change dynamically from problem to problem. During inference, the inference engine operates on the fact table according to how the rules execute, for example deleting, adding, or modifying facts.
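As a concrete illustration, the following minimal Python sketch shows one way the knowledge base, inference engine, and comprehensive database could interact. It is a sketch only: the fact names, rule premises, and fusion-mode labels are hypothetical, and simple first-match forward chaining stands in for the Rete pattern matching described above.

    # Minimal forward-chaining sketch of a navigation expert system.
    # Facts, rule premises, and fusion-mode names are illustrative
    # assumptions, not the rule set of the invention.

    facts = {"imu_healthy": True, "vision_healthy": True, "in_match_area": False}

    # Each rule: (premise over the fact base, conclusion, decision basis).
    rules = [
        (lambda f: f["imu_healthy"] and f["vision_healthy"] and f["in_match_area"],
         ("fusion_mode", "inertial+scene_matching"),
         "both sensors healthy and inside a matching area"),
        (lambda f: f["imu_healthy"] and f["vision_healthy"] and not f["in_match_area"],
         ("fusion_mode", "inertial+visual_odometry"),
         "both sensors healthy but outside the matching area"),
        (lambda f: f["imu_healthy"] and not f["vision_healthy"],
         ("fusion_mode", "pure_inertial"),
         "vision unavailable, fall back to pure inertial"),
    ]

    def infer(facts):
        """Fire the first rule whose premise holds, add its conclusion to the
        comprehensive database (working memory), and return the decision basis."""
        for premise, (key, value), basis in rules:
            if premise(facts):
                facts[key] = value  # intermediate fact stored in the database
                return value, basis
        return None, "no rule matched"

    mode, basis = infer(facts)
    print(f"selected fusion mode: {mode} ({basis})")

Here the returned basis plays the role of the explanation engine's output: the user sees not only the selected fusion mode but why it was selected.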
The intelligent integrated navigation system mainly studies the selection strategy and switching of the inertial/visual integrated navigation working mode, so that the integrated navigation system intelligently judges the state of each single sensor and of the integrated navigation, and completes the combined system's own decisions according to the sensor states, thereby guaranteeing overall navigation positioning performance. A knowledge base for the inertial and visual sensors is established, and factors such as the working health state of each navigation source, the current mission requirements, and the external interference environment are analyzed comprehensively to complete the working-mode selection strategy and switching.
The invention provides an intelligent integrated navigation method, which comprises the following specific steps:
the method comprises the following steps: generating IMU simulation data:
1.1) Generate IMU simulation data using the measurement principles of the gyroscope and accelerometer. The attitude angular rate output by the carrier in flight is $\dot{\Theta}$, from which, according to the Euler angle theorem, one obtains $\omega_{nb}^{b}$, the angular velocity of the carrier coordinate frame relative to the geographic coordinate frame, where $C_n^b$ is the attitude matrix converting from the geographic coordinate frame to the carrier coordinate frame.
The ideal output of the gyroscope, $\omega_{ib}^{b}$, is composed of
$\omega_{ib}^{b} = \omega_{nb}^{b} + C_n^b\,\omega_{in}^{n}$
where $C_n^b\,\omega_{in}^{n}$ is the projection onto the carrier frame of the angular velocity of the geographic frame relative to the inertial frame.
The ideal output of the accelerometer is
$f^{b} = C_n^b\,f^{n}$
where $f^{n}$ is the specific force in the geographic frame.
1.2) adding random errors (white noise, first-order Markov errors, random constant errors) to the output of the ideal gyroscope as the simulated gyroscope output data.
1.3) adding random error (first order Markov error) to the output of the ideal accelerometer as the accelerometer simulation output data.
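To make steps 1.2) and 1.3) concrete, the following Python sketch corrupts ideal sensor outputs with the three error types named above. The sample rate, correlation time, and noise magnitudes are illustrative assumptions, not values taken from the invention.

    import numpy as np

    def first_order_markov(n, dt, sigma, tau):
        """First-order Gauss-Markov process: b_k = exp(-dt/tau)*b_{k-1} + w_k."""
        b = np.zeros(n)
        for k in range(1, n):
            b[k] = np.exp(-dt / tau) * b[k - 1] + sigma * np.sqrt(dt) * np.random.randn()
        return b

    def simulate_gyro(ideal, dt=0.005, sigma_wn=1e-4, sigma_gm=5e-5,
                      tau=300.0, const_bias=1e-5):
        """Ideal gyro output plus white noise, a first-order Markov error,
        and a random constant error (all in rad/s)."""
        n = len(ideal)
        return (ideal + sigma_wn * np.random.randn(n)
                + first_order_markov(n, dt, sigma_gm, tau) + const_bias)

    def simulate_accel(ideal, dt=0.005, sigma_gm=1e-4, tau=1000.0):
        """Ideal accelerometer output plus a first-order Markov error (m/s^2)."""
        return ideal + first_order_markov(len(ideal), dt, sigma_gm, tau)

    # Example: 10 s of a constant 0.01 rad/s rate sampled at 200 Hz.
    gyro_out = simulate_gyro(np.full(2000, 0.01))
    accel_out = simulate_accel(np.full(2000, 9.8))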
Step two: acquire visual data, i.e., collect and store images with a camera to generate visual navigation data.
Step three: IMU and vision sensor initialization:
3.1) Visual-data processing extracts features from the images and tracks and computes the feature points; based on the Structure from Motion (SFM) technique, the visual navigation parameters in a relative coordinate frame can be computed using the bundle adjustment method (a tracking sketch follows this list).
3.2) Because the IMU data rate is far higher than the camera frame rate, time-stamp alignment must be considered in order to fuse the data; pre-integration integrates the IMU data between two camera frames.
3.3) With the preprocessed data, register the visual solution with the IMU pre-integration result, align their time stamps, and initialize.
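The feature extraction and tracking in 3.1) could be realized, for example, with corner detection and pyramidal Lucas-Kanade optical flow; the sketch below uses OpenCV as one possible implementation, with illustrative detector parameters.

    import cv2

    def track_features(prev_gray, curr_gray, prev_pts=None):
        """Detect corners in the previous frame when the track count is low,
        then track them into the current frame with pyramidal LK optical flow;
        the surviving point pairs feed the SFM / bundle adjustment stage."""
        if prev_pts is None or len(prev_pts) < 50:
            prev_pts = cv2.goodFeaturesToTrack(prev_gray, 200, 0.01, 10)
        curr_pts, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray,
                                                          prev_pts, None)
        ok = status.ravel() == 1
        return prev_pts[ok], curr_pts[ok]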
Step four: intelligent decision system construction
The inertial/visual combined navigation system combines the inertial and visual navigation-source sensors. The two sensors differ in working state, real-time behavior, geometric distribution, and so on, so they must be managed and decided over, and combined using the currently best fusion mode, thereby improving the navigation positioning accuracy and reliability of the system.
4.1) For the navigation working-mode selection strategy and switching, the working health state of each navigation source, the current mission requirements, and the external interference environment are analyzed comprehensively, and judgments or decisions on navigation-source management, fusion mode, and information output mode are given as the performance-evaluation result. The intelligent decision flow is shown in fig. 3.
4.2) The inertial and visual navigation information and the performance-evaluation results (anomaly-detection results and the like) are input into the intelligent navigation expert system.
4.3) The inference engine first sets the rules in the knowledge base running, reads the current state of each sensing source, and generates intermediate facts that are stored in the comprehensive database.
4.4) The inference engine matches the rules in the knowledge base against the real-time state of each sensing source, selects the currently optimal navigation fusion mode, and activates and executes the corresponding rules; the explanation engine explains and outputs the decision basis for the current selection to the user; finally, if the navigation fusion mode switches, a decision is made on the performance trend of the fusion mode.
4.5) The inference engine searches the knowledge base according to the input navigation-source sensor information; the corresponding actions are activated whenever certain conditions are met, until inference finishes and the optimal navigation decision information is obtained. The inference process is in fact a search-and-matching process.
Step five: inertial/visual combined simulation: in a loosely coupled combination mode, the scene-matching result and the IMU inertial data are combined with an unscented Kalman filter.
The embodiment of the invention generates the needed simulation data using the IMU working principle, then initializes the IMU and the visual sensor, then makes an intelligent decision on the combination mode, and finally carries out the inertial/visual combined simulation analysis.
The specific implementation steps are as follows:
the method comprises the following steps: generating IMU simulation data:
1.1) Generate IMU simulation data using the measurement principles of the gyroscope and accelerometer. The attitude angular rate output by the carrier in flight is $\dot{\Theta}$, from which, according to the Euler angle theorem, one obtains $\omega_{nb}^{b}$, the angular velocity of the carrier coordinate frame relative to the geographic coordinate frame, where $C_n^b$ is the attitude matrix converting from the geographic coordinate frame to the carrier coordinate frame.
The ideal output of the gyroscope, $\omega_{ib}^{b}$, is composed of
$\omega_{ib}^{b} = \omega_{nb}^{b} + C_n^b\,\omega_{in}^{n}$
where $C_n^b\,\omega_{in}^{n}$ is the projection onto the carrier frame of the angular velocity of the geographic frame relative to the inertial frame.
The ideal output of the accelerometer is
$f^{b} = C_n^b\,f^{n}$
where $f^{n}$ is the specific force in the geographic frame.
1.2) adding random errors (white noise, first-order Markov errors, random constant errors) to the output of the ideal gyroscope as the simulated gyroscope output data.
1.3) adding random error (first order Markov error) to the output of the ideal accelerometer as the accelerometer simulation output data.
Step two: acquire visual data, i.e., collect and store images with a camera to generate visual navigation data.
Step three: IMU and vision sensor initialization:
3.1) Visual-data processing extracts features from the images and tracks and computes the feature points; based on the Structure from Motion (SFM) technique, the visual navigation parameters in a relative coordinate frame can be computed using the bundle adjustment method.
3.2) Because the IMU data rate is far higher than the camera frame rate, time-stamp alignment must be considered in order to fuse the data; pre-integration integrates the IMU data between two camera frames. The relative change in attitude, velocity, and position between two key frames can be expressed by the IMU measurements alone and is independent of the state being optimally estimated, so the pre-integrated quantity needs to be computed only once during iterative optimization, and the state of the next key frame is computed from the relative change only when the state is updated. This greatly reduces the computation load and improves the running speed on an embedded system (a simplified sketch follows this list).
3.3) With the preprocessed data, register the visual solution with the IMU pre-integration result, align their time stamps, and initialize. During registration, the corresponding three-dimensional information, including the motion parameters of the camera and the structure of the scene, can be recovered from the two-dimensional image sequence, giving a basic state estimate.
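The simplified Python sketch below illustrates the pre-integration of 3.2): it accumulates the relative rotation, velocity, and position change from the IMU samples between two camera frames. Gravity compensation, sensor biases, and noise terms, which a real implementation must handle, are deliberately omitted.

    import numpy as np

    def skew(w):
        """Skew-symmetric matrix of a 3-vector."""
        return np.array([[0.0, -w[2], w[1]],
                         [w[2], 0.0, -w[0]],
                         [-w[1], w[0], 0.0]])

    def preintegrate(gyro, accel, dt):
        """Accumulate relative rotation dR, velocity change dv, and position
        change dp over all IMU samples between two frames. The result depends
        only on the IMU measurements, so it is computed once per frame pair."""
        dR, dv, dp = np.eye(3), np.zeros(3), np.zeros(3)
        for w, a in zip(gyro, accel):
            dp += dv * dt + 0.5 * (dR @ a) * dt ** 2
            dv += (dR @ a) * dt
            dR = dR @ (np.eye(3) + skew(w) * dt)  # first-order rotation update
        return dR, dv, dp

Because dR, dv, and dp are independent of the state being estimated, iterative optimization can reuse them instead of re-integrating the raw IMU stream at every iteration, which is exactly the computational saving described above.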
Step four: intelligent decision system construction
The inertial/visual combined navigation system combines the inertial and visual navigation-source sensors. The two sensors differ in working state, real-time behavior, geometric distribution, and so on, so they must be managed and decided over, and combined using the currently best fusion mode, thereby improving the navigation positioning accuracy and reliability of the system.
4.1) Because the inertial and visual sensors have different navigation positioning characteristics, the navigation sources must be abstracted and a description of each navigation source established. For the working-mode selection strategy and switching, factors such as the working health state of each navigation source, the current mission requirements, and the external interference environment are analyzed comprehensively, and judgments or decisions on navigation-source management, fusion mode, and information output mode are given. The intelligent decision flow is shown in fig. 4.
4.2) The inertial and visual navigation information and the performance-evaluation results (anomaly-detection results and the like) are input into the intelligent navigation expert system.
4.3) The inference engine first sets the rules in the knowledge base running, reads the current state of each sensing source, and generates intermediate facts that are stored in the comprehensive database.
4.4) The inference engine matches the rules in the knowledge base against the real-time state of each sensing source, selects the currently optimal navigation fusion mode, and activates and executes the corresponding rules; the explanation engine explains and outputs the decision basis for the current selection to the user; finally, if the navigation fusion mode switches, a decision is made on the performance trend of the fusion mode.
4.5) The inference engine searches the knowledge base according to the input navigation-source sensor information; the corresponding actions are activated whenever certain conditions are met, until inference finishes and the optimal navigation decision information is obtained. The inference process is in fact a search-and-matching process. The main function of the inference engine is to reason over and judge the real-time state of each navigation-source sensor according to the relevant facts and rules and to select the currently available optimal fusion mode, thereby realizing the system's fusion-mode decision. The inference mechanism relies on knowledge: the knowledge is stored in the knowledge base in a specific form that makes it convenient to analyze and process the working state, performance, geometric distribution, and other states of the navigation-source sensors, thus realizing navigation-source management and decision reasoning.
Step five: inertial/visual combined simulation: in a loosely coupled combination mode, the scene-matching result and the IMU inertial data are combined with an unscented Kalman filter.
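One possible loosely coupled structure for step five is sketched below: an unscented transform fuses an inertially propagated state with a scene-matching position fix. The state layout, the process and measurement models, and the noise matrices are simplifying assumptions for illustration; the sigma points are also reused between prediction and update, a common shortcut.

    import numpy as np

    def sigma_points(x, P, kappa=1.0):
        """2n+1 sigma points and weights for mean x and covariance P."""
        n = len(x)
        S = np.linalg.cholesky((n + kappa) * P)
        pts = np.vstack([x, x + S.T, x - S.T])
        w = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
        w[0] = kappa / (n + kappa)
        return pts, w

    def ukf_step(x, P, z, f, h, Q, R):
        """One UKF predict/update cycle: propagate sigma points through the
        process model f, then fuse measurement z through the measurement model h."""
        pts, w = sigma_points(x, P)
        Xp = np.array([f(p) for p in pts])
        xp = w @ Xp
        Pp = Q + sum(wi * np.outer(d, d) for wi, d in zip(w, Xp - xp))
        Zp = np.array([h(p) for p in Xp])
        zp = w @ Zp
        Pzz = R + sum(wi * np.outer(d, d) for wi, d in zip(w, Zp - zp))
        Pxz = sum(wi * np.outer(dx, dz) for wi, dx, dz in zip(w, Xp - xp, Zp - zp))
        K = Pxz @ np.linalg.inv(Pzz)
        return xp + K @ (z - zp), Pp - K @ Pzz @ K.T

In a loose coupling, x could hold position and velocity, f could propagate them with the IMU-derived motion over one scene-matching period, and h would simply pick out the position components to compare against the scene-matching fix.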
Calibration example
Set the flight time of the unmanned aerial vehicle to 2 hours, the flight speed to 300 m/s, the inertial navigation positioning accuracy to 1 n mile/h (CEP), the inertial navigation update rate to 200 Hz, and the positioning accuracy in a scene-matching area to 1 m (CEP); considering the large computation load of scene matching, the scene-matching position update period is 1 s. In a scene-matching failure area, the system works in pure inertial mode only, and the system error matrix is propagated recursively during that time to keep the navigation information continuous. After the scene-matching function recovers, the whole navigation system returns to the combined working mode; with a scene-matching update period of 1 s, the combined navigation positioning accuracy is better than 2.5 m.
In practical applications scene matching is not always effective, so it is worth considering for how long scene matching may remain unavailable while the combined navigation positioning accuracy can still be guaranteed; this provides a reference for planning the spacing of matching areas and selecting ground scenes.
Under the constraint of the combined positioning requirement, the scene-matching update period is adjusted until the inertial/visual combined positioning error exceeds 5 m; the time interval at that point can be regarded as the maximum scene-matching time interval, i.e., the farthest spacing between scene-matching areas.
Simulating under the conditions set above, when the combination period is less than 10 s, i.e., a flight distance of 3 km, the combined navigation positioning accuracy is better than 5 m (CEP); in other words, scene matching every 3 km of flight can guarantee the combined positioning accuracy over the whole course.
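The 10 s / 3 km figure can be sanity-checked with a rough calculation, under the crude assumption that the 1 n mile/h inertial drift grows linearly during a scene-matching outage (real inertial error growth is not linear, so this is only an order-of-magnitude check):

    drift = 1852.0 / 3600.0   # 1 n mile/h as m/s of position-error growth
    speed = 300.0             # flight speed, m/s
    budget = 5.0              # allowed combined positioning error, m

    max_outage = budget / drift      # ~9.7 s before drift alone exceeds 5 m
    max_gap = max_outage * speed     # ~2.9 km between matching areas
    print(f"max outage ~{max_outage:.1f} s, max gap ~{max_gap / 1000:.1f} km")

Both values agree with the simulated 10 s combination period and 3 km flight distance.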
The above examples are only for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (1)

1. An inertial/visual intelligent integrated navigation method is characterized by comprising the following steps:
generating inertial measurement unit (IMU) simulation data by using a gyroscope and an accelerometer, wherein the attitude angular rate output by the carrier in flight is $\dot{\Theta}$ and the angular velocity of the carrier in the carrier coordinate frame relative to the geographic coordinate frame is $\omega_{nb}^{b}$, wherein $C_n^b$ is the attitude matrix converting from the geographic coordinate frame to the carrier coordinate frame;
the ideal output of the gyroscope is $\omega_{ib}^{b} = \omega_{nb}^{b} + C_n^b\,\omega_{in}^{n}$, wherein $C_n^b\,\omega_{in}^{n}$ is the projection onto the carrier frame of the angular velocity of the geographic frame relative to the inertial frame;
the ideal output of the accelerometer is $f^{b} = C_n^b\,f^{n}$, wherein $f^{n}$ is the specific force in the geographic frame;
adding random errors including white noise, first-order Markov errors and random constant errors to the output of the ideal gyroscope to serve as output data of the simulated gyroscope;
adding random errors including first-order Markov errors to the output of an ideal accelerometer to serve as simulation output data of the accelerometer;
secondly, acquiring images, extracting features from the images, tracking and computing the feature points, and computing the visual navigation parameters in a relative coordinate frame using the bundle adjustment method; integrating the IMU data between two camera frames through pre-integration, registering the visual solution with the IMU pre-integration result, and aligning the time stamps;
inputting the inertial and visual navigation information and the performance-evaluation results into the intelligent navigation expert system; the inference engine sets the rules in the knowledge base running, reads the current state of each sensing source, generates intermediate facts, and stores them in the comprehensive database; the inference engine then matches the rules in the knowledge base against the real-time state of each sensing source, selects the currently optimal navigation fusion mode, activates and executes the corresponding rules, and outputs the decision basis for the current selection to the user; the inference engine searches the knowledge base according to the input navigation-source sensor information, activating the corresponding action whenever the set conditions are met, until inference finishes and the optimal navigation decision information is obtained;
and fourthly, performing unscented Kalman filtering combination on the scene matching result and the IMU inertial data.
CN201811579143.7A 2018-12-24 2018-12-24 An inertial/visual intelligent combined navigation method Pending CN109655058A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811579143.7A CN109655058A (en) 2018-12-24 2018-12-24 An inertial/visual intelligent combined navigation method


Publications (1)

Publication Number Publication Date
CN109655058A 2019-04-19

Family

ID=66115466

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811579143.7A Pending CN109655058A (en) 2018-12-24 2018-12-24 An inertial/visual intelligent combined navigation method

Country Status (1)

Country Link
CN (1) CN109655058A (en)



Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160033280A1 (en) * 2014-08-01 2016-02-04 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable earpiece for providing social and environmental awareness
US20160305784A1 (en) * 2015-04-17 2016-10-20 Regents Of The University Of Minnesota Iterative kalman smoother for robust 3d localization for vision-aided inertial navigation
CN106708066A (en) * 2015-12-20 2017-05-24 中国电子科技集团公司第二十研究所 Autonomous landing method of unmanned aerial vehicle based on vision/inertial navigation
CN106446815A (en) * 2016-09-14 2017-02-22 浙江大学 Simultaneous positioning and map building method
CN106679648A (en) * 2016-12-08 2017-05-17 东南大学 Vision-inertia integrated SLAM (Simultaneous Localization and Mapping) method based on genetic algorithm
CN106840196A (en) * 2016-12-20 2017-06-13 南京航空航天大学 A kind of strap-down inertial computer testing system and implementation method
CN107014371A (en) * 2017-04-14 2017-08-04 东南大学 UAV integrated navigation method and apparatus based on the adaptive interval Kalman of extension
CN107796391A (en) * 2017-10-27 2018-03-13 哈尔滨工程大学 A kind of strapdown inertial navigation system/visual odometry Combinated navigation method
CN108731670A (en) * 2018-05-18 2018-11-02 南京航空航天大学 Inertia/visual odometry combined navigation locating method based on measurement model optimization
CN108375370A (en) * 2018-07-02 2018-08-07 江苏中科院智能科学技术应用研究院 A kind of complex navigation system towards intelligent patrol unmanned plane

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
CHENG Lin (程林): "Research on the Application of Expert Systems in Aided Decision-Making for Integrated Navigation", Science & Technology Vision (《科技视界》) *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111065043A (en) * 2019-10-25 2020-04-24 重庆邮电大学 System and method for fusion positioning of vehicles in tunnel based on vehicle-road communication
CN113587975A (en) * 2020-04-30 2021-11-02 伊姆西Ip控股有限责任公司 Method, apparatus and computer program product for managing application environments
CN113031040A (en) * 2021-03-01 2021-06-25 宁夏大学 Positioning method and system for airport ground clothes vehicle
CN113949999A (en) * 2021-09-09 2022-01-18 之江实验室 Indoor positioning navigation equipment and method
CN113949999B (en) * 2021-09-09 2024-01-30 之江实验室 Indoor positioning navigation equipment and method
CN116531690A (en) * 2023-06-25 2023-08-04 中国人民解放军63863部队 Forest fire extinguishing bomb with throwing device and throwing control method
CN116531690B (en) * 2023-06-25 2023-10-20 中国人民解放军63863部队 Forest fire extinguishing bomb with throwing device and throwing control method

Similar Documents

Publication Publication Date Title
CN110556012B (en) Lane positioning method and vehicle positioning system
CN109655058A (en) An inertial/visual intelligent combined navigation method
CN106840148B (en) Wearable positioning and path guiding method based on binocular camera under outdoor working environment
CN113916243B (en) Vehicle positioning method, device, equipment and storage medium for target scene area
CN110377025A (en) Sensor aggregation framework for automatic driving vehicle
US9071829B2 (en) Method and system for fusing data arising from image sensors and from motion or position sensors
CN109084732A (en) Positioning and air navigation aid, device and processing equipment
CN109887053A (en) A kind of SLAM map joining method and system
CN109341706A (en) A kind of production method of the multiple features fusion map towards pilotless automobile
CN110345955A (en) Perception and planning cooperation frame for automatic Pilot
CN110389580A (en) Method for planning the drift correction in the path of automatic driving vehicle
CN111959495B (en) Vehicle control method and device and vehicle
CN114526745B (en) Drawing construction method and system for tightly coupled laser radar and inertial odometer
CN107888828A (en) Space-location method and device, electronic equipment and storage medium
CN107478220A (en) Unmanned plane indoor navigation method, device, unmanned plane and storage medium
CN109196432A (en) Speed control parameter estimation method for automatic driving vehicle
CN106030430A (en) Multi-sensor fusion for robust autonomous flight in indoor and outdoor environments with a rotorcraft micro-aerial vehicle (MAV)
JP2019518222A (en) Laser scanner with real-time on-line egomotion estimation
CN109461208A (en) Three-dimensional map processing method, device, medium and calculating equipment
CN110726406A (en) Improved nonlinear optimization monocular inertial navigation SLAM method
CN104280022A (en) Digital helmet display device tracking system of visual-aided inertial measuring unit
CN110096054A (en) For using multiple threads to generate the method and system of the reference line for automatic driving vehicle
CN113934205B (en) Method, apparatus, device and storage medium for controlling guiding robot
CN114088087B (en) High-reliability high-precision navigation positioning method and system under unmanned aerial vehicle GPS-DENIED
CN113034594A (en) Pose optimization method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20190419