CN113465598B - Inertial integrated navigation system suitable for unmanned aerial vehicle - Google Patents


Info

Publication number
CN113465598B
CN113465598B CN202110890897.XA
Authority
CN
China
Prior art keywords
navigation
aerial vehicle
unmanned aerial
obstacle
module
Prior art date
Legal status
Active
Application number
CN202110890897.XA
Other languages
Chinese (zh)
Other versions
CN113465598A (en)
Inventor
刘翠丽
殷帅
孙路
庞诚
韦博
亢凯
刘斌
崔雨波
贾瑞武
王亚鹏
王理
Current Assignee
Beijing Heng Heng Science And Technology Research Institute Co ltd
Original Assignee
Beijing Heng Heng Science And Technology Research Institute Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Heng Heng Science And Technology Research Institute Co ltd filed Critical Beijing Heng Heng Science And Technology Research Institute Co ltd
Priority to CN202110890897.XA priority Critical patent/CN113465598B/en
Publication of CN113465598A publication Critical patent/CN113465598A/en
Application granted granted Critical
Publication of CN113465598B publication Critical patent/CN113465598B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Navigation (AREA)

Abstract

The application discloses an inertial integrated navigation system suitable for an unmanned aerial vehicle, comprising: an environment information acquisition module for acquiring the wind speed and the wind direction angle at the current moment during navigation of the unmanned aerial vehicle; an image acquisition module for acquiring a visual image of the unmanned aerial vehicle during navigation; an inertial navigation module for acquiring an initial navigation direction and an initial navigation speed during navigation based on a preset navigation period and an inertial navigation algorithm; a navigation correction module for determining the distance between an obstacle in the visual image and the unmanned aerial vehicle, and correcting the initial navigation direction and the initial navigation speed in combination with the wind speed and the wind direction angle; and a navigation control module for generating an unmanned aerial vehicle navigation instruction according to the corrected navigation direction and the corrected navigation speed. Through the technical scheme in this application, navigation is corrected based on the obstacles, wind speed and wind direction encountered during navigation, improving the accuracy of the unmanned aerial vehicle's inertial navigation and reducing the complexity of the navigation algorithm.

Description

Inertial integrated navigation system suitable for unmanned aerial vehicle
Technical Field
The application relates to the technical field of unmanned aerial vehicle navigation, in particular to an inertial integrated navigation system suitable for an unmanned aerial vehicle.
Background
An inertial navigation system (Inertial Navigation System, INS) is a dead-reckoning navigation mode. Its basic working principle, based on Newton's laws of mechanics, is to measure the acceleration of the carrier in an inertial reference frame, integrate it over time and transform it into the navigation coordinate system to obtain information such as speed, yaw angle and position in that system. However, because navigation errors inherently accumulate, navigation accuracy degrades as time increases, so other navigation modes, such as polarized-light assistance and visual assistance, generally need to be combined during the autonomous flight navigation of an unmanned aerial vehicle.
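As a minimal illustration of this dead-reckoning principle (not the patent's implementation; the planar model, the sample data and the function name are assumptions for illustration only):

```python
def dead_reckon(samples, dt, v0=(0.0, 0.0), p0=(0.0, 0.0)):
    """Integrate 2-D navigation-frame acceleration samples into velocity and
    position.

    samples: list of (ax, ay) accelerations already transformed into the
    navigation frame; dt: sample interval in seconds.
    """
    vx, vy = v0
    px, py = p0
    for ax, ay in samples:
        # velocity is the time integral of acceleration
        vx += ax * dt
        vy += ay * dt
        # position is the time integral of velocity
        px += vx * dt
        py += vy * dt
    return (vx, vy), (px, py)

# constant 1 m/s^2 along x for 10 s starting from rest
v, p = dead_reckon([(1.0, 0.0)] * 10, dt=1.0)
```

Any bias in the acceleration samples is integrated twice, which is exactly why the position error of an INS grows with time and motivates the corrections described below.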
With the continuous development of unmanned aerial vehicle technology, the application fields involving unmanned aerial vehicles are increasingly extensive, for example aerial photography, environmental monitoring and logistics transportation. The unmanned aerial vehicle is therefore required to be suitable for complex flight environments during autonomous flight navigation, particularly in urban areas dense with high-rise buildings.
In the existing unmanned aerial vehicle navigation technology, the following problems generally exist:
1. The algorithms are complex. To guarantee the operation rate of the algorithm so that the route can be adjusted in time during navigation, a large amount of hardware resources must be occupied, which makes the unmanned aerial vehicle expensive and hinders the popularization of unmanned aerial vehicle applications.
2. Real-time performance is poor and easily influenced by external environmental factors such as wind speed, wind direction and obstacles. In particular, for obstacles on the flight route, a conventional navigation method generally needs an embedded autonomous obstacle-avoidance algorithm, which further increases the complexity of the navigation algorithm and is not conducive to real-time adjustment of the unmanned aerial vehicle's speed and direction.
Disclosure of Invention
The purpose of the present application is: based on the obstacles, wind speed and wind direction in the navigation route of the unmanned aerial vehicle, the navigation direction and the navigation speed in the inertial navigation process of the unmanned aerial vehicle are corrected, the inertial navigation precision of the unmanned aerial vehicle is improved, the complexity of a navigation algorithm is reduced, and the real-time performance of the navigation control of the unmanned aerial vehicle is improved.
The technical scheme of the application is as follows: there is provided an inertial integrated navigation system for an unmanned aerial vehicle, the system comprising: the navigation system comprises an environment information acquisition module, an image acquisition module, an inertial navigation module, a navigation correction module and a navigation control module; the environment information acquisition module is used for acquiring the wind speed and the wind direction angle at the current moment in the navigation process of the unmanned aerial vehicle; the image acquisition module is used for acquiring an unmanned aerial vehicle visual image in the unmanned aerial vehicle navigation process; the inertial navigation module is used for acquiring an initial navigation direction and an initial navigation speed in the unmanned aerial vehicle navigation process based on a preset navigation period and an inertial navigation algorithm; the navigation correction module is used for determining the distance between the obstacle in the visual image of the unmanned aerial vehicle and the unmanned aerial vehicle, and correcting the initial navigation direction and the initial navigation speed by combining the wind speed and the wind direction angle;
a navigation correction module, comprising: the device comprises a contour recognition module, a distance calculation module and a contour change calculation module; the profile recognition module is used for recognizing a first obstacle profile in the unmanned aerial vehicle visual image in a first preset navigation period in an image segmentation mode, and recognizing a second obstacle profile in the unmanned aerial vehicle visual image in a second preset navigation period in an image segmentation mode; the distance calculation module is used for extracting a first pixel characteristic point and a second pixel characteristic point in the first obstacle outline, calculating a first outline distance of the obstacle based on the first pixel characteristic point and the second pixel characteristic point, and extracting a third pixel characteristic point and a fourth pixel characteristic point in the second obstacle outline based on the first pixel characteristic point and the second pixel characteristic point to calculate a second outline distance of the obstacle, wherein the first outline distance is the distance between the first pixel characteristic point and the second pixel characteristic point, and the second outline distance is the distance between the third pixel characteristic point and the fourth pixel characteristic point; the contour change calculation module is used for calculating the difference value between the first contour distance and the second contour distance and recording the difference value as the value of the contour change;
a navigation correction module, comprising: a first correction module; the first correction module is used for determining the position point of the obstacle closest to the unmanned aerial vehicle when the distance between the unmanned aerial vehicle and the obstacle is smaller than or equal to a preset course adjustment distance, taking that position point as the circle centre and the preset course adjustment distance as the radius to generate a course adjustment circle, generating an inscribed polygon of the course adjustment circle from an inscribed edge length, selecting a course correction edge, calculating the included angle between the course correction edge and the current navigation direction, recording the included angle as the course adjustment angle, and correcting the initial navigation direction based on the course adjustment angle, wherein the inscribed edge length is the product of the current navigation speed of the unmanned aerial vehicle and the preset navigation period length, and the course correction edge is the inscribed edge closest to the unmanned aerial vehicle in the inscribed polygon in the direction away from the obstacle; the first correction module is also used for calculating a second navigational speed correction value based on the vector sum of the course adjustment angle and the wind direction angle, combined with the wind speed and the historical track, and correcting the initial navigational speed according to the second navigational speed correction value;
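One plausible reading of this inscribed-polygon construction can be sketched as follows; the half-chord steering relation, the tangent convention and all names are assumptions, since the claim gives the construction only geometrically:

```python
import math

def course_adjustment_angle(speed, period, adjust_dist, heading, bearing_to_obstacle):
    """Heading correction from the inscribed-polygon construction: a circle of
    radius adjust_dist is centred on the obstacle point closest to the UAV,
    and the inscribed polygon's edge length is the distance covered in one
    navigation period (speed * period).  Angles are in radians."""
    side = min(speed * period, 2.0 * adjust_dist)  # a chord cannot exceed the diameter
    # central angle subtended by a chord of length `side` on a circle of radius adjust_dist
    central = 2.0 * math.asin(side / (2.0 * adjust_dist))
    # assumed geometry: the course correction edge leaves the tangent direction
    # (perpendicular to the bearing towards the obstacle) by half the central
    # angle, on the side away from the obstacle
    edge_direction = bearing_to_obstacle + math.pi / 2.0 + central / 2.0
    return edge_direction - heading  # signed course adjustment angle
```

With speed 5 m/s, a 1 s period and a 5 m adjustment distance, the chord equals the radius, so the central angle is 60 degrees; the sketch then steers 30 degrees past the tangent direction.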
the navigation correction module further comprises: a second correction module; the second correction module is used for calculating a first navigational speed correction value based on the wind direction angle and the wind speed when the distance between the unmanned aerial vehicle and the obstacle is larger than the preset course adjustment distance, and correcting the initial navigational speed according to the first navigational speed correction value;
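The first navigational speed correction value is not given as a formula in the text; a minimal hedged sketch, simply adding the wind's along-track component to the initial speed (angles in radians, the function name and the additive model are assumptions):

```python
import math

def correct_speed(initial_speed, wind_speed, wind_angle, heading):
    """Correct the initial navigation speed with the wind's along-track
    component (tailwind positive, headwind negative): the projection of the
    wind vector onto the current heading direction."""
    along_track = wind_speed * math.cos(wind_angle - heading)
    return initial_speed + along_track
```

A pure tailwind of 3 m/s raises a 10 m/s initial speed to 13 m/s under this model; a pure headwind lowers it to 7 m/s.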
and the navigation control module is used for generating an unmanned aerial vehicle navigation instruction according to the corrected navigation direction and the corrected navigation speed.
In any one of the above technical solutions, further, the distance between the obstacle and the unmanned aerial vehicle in the unmanned aerial vehicle visual image is determined by the current navigational speed of the unmanned aerial vehicle and the value of the contour change of the obstacle, wherein the value of the contour change is the height difference value of the same position of the obstacle in the unmanned aerial vehicle visual image in two adjacent preset navigation periods.
In any of the above solutions, further, the system further includes: a camera; the camera is electrically connected to the image acquisition module, and is a monocular camera or a binocular camera, and the camera is used for shooting visual images of the unmanned aerial vehicle.
The beneficial effects of this application are:
According to the technical scheme in the present application, inertial navigation is combined with visual processing technology and environmental information (such as wind speed and wind direction) during navigation. Visual processing is used to calculate the distance between the unmanned aerial vehicle and an obstacle by identifying the obstacle's contour in the visual image; this calculated distance is then combined with the wind speed and wind direction to correct the initial navigation direction and initial navigation speed output by inertial navigation, which improves the accuracy of the unmanned aerial vehicle's inertial navigation, reduces the complexity of the navigation algorithm, improves the real-time performance of navigation control, and optimizes the cruising ability of the unmanned aerial vehicle.
Recognition of the obstacle contour is achieved through multiple divisions and merges of the unmanned aerial vehicle visual image. Equidistant coarse division first splits the visual image into a number of larger grids, so that fast operations can be performed on the basis of a small number of grids each containing many pixels, improving image recognition efficiency. On this basis, a similar distance is introduced to classify the pixel points within the larger first image grids, realizing a secondary division of the visual image and generating the second image grid, in which each pixel point is classified to the central pixel point with the closest similar distance. These two divisions can solve the over-segmentation and under-segmentation problems of the image segmentation process, reduce the number of segmented grids while guaranteeing segmentation accuracy, and improve the accuracy and efficiency of obstacle contour recognition.
According to the technical scheme in the present application, during speed correction, in order to avoid a collision between the unmanned aerial vehicle and the obstacle in advance, the navigation direction is first corrected according to the distance between the unmanned aerial vehicle and the obstacle and the preset course adjustment distance; the second navigational speed correction value is then obtained from the vector sum of the course adjustment angle of the corrected direction and the wind direction angle. This realizes correction of the initial navigation speed based on the distance to the obstacle, the course adjustment angle, the wind direction angle, the wind speed and the historical track, improving navigation accuracy while guaranteeing the accuracy of the speed, and optimizing the real-time performance of unmanned aerial vehicle navigation control.
Drawings
The advantages of the foregoing and/or additional aspects of the present application will become apparent and readily appreciated from the description of the embodiments, taken in conjunction with the accompanying drawings, wherein:
FIG. 1 is a schematic block diagram of an inertial integrated navigation system suitable for use with a drone, according to one embodiment of the present application;
FIG. 2 is a schematic illustration of a change in the outline of an obstacle in a visual image of a drone according to one embodiment of the present application;
fig. 3 is a schematic diagram of four connected domains according to one embodiment of the present application.
Detailed Description
In order that the above-recited objects, features and advantages of the present application will be more clearly understood, a more particular description of the application will be rendered by reference to the appended drawings and appended detailed description. It should be noted that, without conflict, the embodiments of the present application and features of the embodiments may be combined with each other.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application, however, the present application may be practiced otherwise than as described herein, and thus the scope of the present application is not limited to the specific embodiments disclosed below.
As shown in fig. 1, the present embodiment provides an inertial integrated navigation system 100 suitable for an unmanned aerial vehicle, where the system 100 includes: the unmanned aerial vehicle navigation system comprises an environment information acquisition module 10, an image acquisition module 20, an inertial navigation module 30, a navigation correction module 40 and a navigation control module 50, wherein the environment information acquisition module 10 is used for acquiring the wind speed and the wind direction angle of the unmanned aerial vehicle at the current moment in the navigation process; the image acquisition module 20 is used for acquiring the visual image of the unmanned aerial vehicle in the navigation process of the unmanned aerial vehicle and determining the distance between the unmanned aerial vehicle and the obstacle.
It should be noted that, the implementation manner of the environmental information acquisition module 10 and the image acquisition module 20 is not limited in this embodiment, where the environmental information acquisition module 10 may be implemented by a conventional module having a data receiving function, and the image acquisition module 20 is implemented based on a module having a conventional image processing function.
Further, the image obtaining module 20 is externally connected with a camera, which may be a monocular camera or a binocular camera, or may be other devices with an image capturing function, where the camera transmits the captured image to the image obtaining module 20, and the image obtaining module 20 performs processing, such as filtering and binarization processing, to obtain the visual image of the unmanned aerial vehicle during the navigation of the unmanned aerial vehicle.
The inertial navigation module 30 is configured to obtain an initial navigation direction and an initial navigation speed in the unmanned aerial vehicle navigation process based on a preset navigation period and an inertial navigation algorithm;
Specifically, this embodiment sets a preset navigation period for unmanned aerial vehicle navigation based on an inertial navigation method and periodically corrects the inertial navigation. A monocular or binocular camera is carried on the unmanned aerial vehicle as the acquisition device for the visual image during navigation, and a wind speed and direction detection or acquisition device serves as the environment information acquisition module 10 to acquire parameters such as wind speed and wind direction angle, so that the initial navigation direction and initial navigation speed obtained by inertial navigation can be corrected based on the obstacle in the visual image and the wind speed and wind direction angle.
Note that, the embodiment is not limited to a specific implementation manner of the inertial navigation method.
The navigation correction module 40 is configured to determine a distance between an obstacle and an unmanned aerial vehicle in a visual image of the unmanned aerial vehicle, and correct an initial navigation direction and an initial navigation speed by combining a wind speed and a wind direction angle, wherein the distance between the obstacle and the unmanned aerial vehicle in the visual image of the unmanned aerial vehicle is determined by a value of a current speed of the unmanned aerial vehicle and a value of a profile change of the obstacle, and the value of the profile change is a height difference value of the same position of the obstacle in the visual image of the unmanned aerial vehicle in two adjacent preset navigation periods.
Specifically, in order to enable the unmanned aerial vehicle to fly autonomously safely and reliably in a complex flight environment, the initial navigation direction and the initial navigation speed obtained by inertial navigation of the unmanned aerial vehicle are corrected according to the distance between the unmanned aerial vehicle and an obstacle, the wind speed and the wind direction angle. When course correction is carried out, an obstacle in a safe navigation range is identified based on the visual image of the unmanned aerial vehicle, and when the unmanned aerial vehicle is judged to be too close to the obstacle, the navigation direction correction is carried out in a direction away from the obstacle, so that the unmanned aerial vehicle is prevented from collision. And correcting the navigation speed of the unmanned aerial vehicle according to a preset function by combining the wind speed and the wind direction angle.
It should be noted that, during navigation, since the size of the obstacle and the viewing angle with which the unmanned aerial vehicle obtains its visual image are constant, as shown in fig. 2, the distance between the unmanned aerial vehicle and the obstacle can be calculated from the value of the change in the obstacle's contour size based on a trigonometric relationship; the specific derivation is not repeated in this embodiment.
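A sketch of one standard derivation of this range estimate, assuming a pinhole camera and a head-on approach (the patent does not give the formula; names and sample numbers are illustrative):

```python
def distance_to_obstacle(contour_h1, contour_h2, speed, period):
    """Estimate range from the growth of the obstacle's apparent contour size
    between two navigation periods, for a UAV approaching head-on.

    Pinhole model: apparent size h = f*H/Z for focal length f, real size H
    and range Z.  With d = speed * period travelled between the two frames,
    eliminating f*H gives the current range Z2 = d * h1 / (h2 - h1).
    """
    d = speed * period
    growth = contour_h2 - contour_h1
    if growth <= 0:
        raise ValueError("contour must grow while approaching the obstacle")
    return d * contour_h1 / growth
```

For example, if the contour grows from 50 px to 60 px while the UAV covers 10 m, the remaining range works out to 50 m under this model.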
Through the above scheme, on the one hand, the unmanned aerial vehicle keeps away from obstacles on the route during navigation, guaranteeing navigation safety; on the other hand, the wind encountered during navigation is fully utilized by correcting the navigation speed based on the wind speed and the wind direction angle, reducing electric energy consumption and prolonging the navigation time of the unmanned aerial vehicle.
On the basis of the above embodiment, in order to accurately correct inertial navigation of an unmanned aerial vehicle, accurately calculate a distance between the unmanned aerial vehicle and an obstacle in the navigation process of the unmanned aerial vehicle, in this embodiment, in the process of determining the change of the profile of the obstacle by the navigation correction module 40, the navigation correction module 40 specifically includes:
the profile recognition module 41 is configured to recognize a first obstacle profile in the visual image of the unmanned aerial vehicle in a first preset navigation period by adopting an image segmentation manner, where the first preset navigation period is a previous preset navigation period in any two adjacent preset navigation periods in the unmanned aerial vehicle navigation process, and is set to be T1; similarly, the second preset navigation period is the next preset navigation period after any two adjacent preset navigation periods, and is set to be T2.
In this embodiment, a conventional image segmentation and recognition processing method may be used to determine the outline of the obstacle in the visual image of the unmanned aerial vehicle.
Note that, in this embodiment, the unmanned aerial vehicle visual image is a gray value image.
Further, in the process of identifying the first obstacle contour E1 in the unmanned aerial vehicle visual image, image segmentation is performed on the visual image so as to extract and identify the first obstacle contour E1 and improve the accuracy of contour recognition. The process specifically comprises the following steps:
step A, performing equidistant rough division on an unmanned aerial vehicle visual image based on a first preset range, generating a first image grid, and calculating the similar distance of each pixel point in the first image grid;
specifically, in order to improve efficiency and accuracy of obstacle contour recognition, a first preset range is determined according to the size of an unmanned aerial vehicle visual image, the unmanned aerial vehicle visual image is subjected to equidistant coarse division, the unmanned aerial vehicle visual image is divided into a plurality of larger grids, so that quick operation is performed on the basis of the larger grids with fewer numbers but more pixels, the efficiency of image recognition is improved, and the divided larger grids are recorded as first image grids.
The similar distance of each pixel point in this embodiment is determined by the pixel value and the coordinates of the pixel point, and the corresponding calculation formula is:

L(i, j) = sqrt[ (I(i, j) - I(i_c, j_c))^2 + ω · ((i - i_c)^2 + (j - j_c)^2) / M^2 ]

where L(i, j) is the similar distance of the pixel point with coordinates (i, j), I(·) is the pixel value, (i_c, j_c) is the central pixel point, i.e. the pixel point corresponding to the geometric centre of the first image grid, ω is the compact factor, and M is the search radius, a set parameter whose value is smaller than half of the minimum side length of the first preset range.

Those skilled in the art will appreciate that the larger the value of the compact factor ω, the more regular the first image meshing; and the smaller the value of the similar distance L(i, j), the more similar the pixel point (i, j) is to the central pixel point (i_c, j_c).
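A sketch of the similar-distance computation (the SLIC-style weighting, the function name and the toy image are assumptions, since the published text only states the qualitative properties of L(i, j)):

```python
import math

def similar_distance(img, i, j, ic, jc, omega, M):
    """Similar distance between pixel (i, j) and the central pixel (ic, jc):
    grey-value difference combined with spatial distance, weighted by the
    compact factor omega and normalised by the search radius M."""
    d_grey = img[i][j] - img[ic][jc]             # pixel-value term
    d_space = (i - ic) ** 2 + (j - jc) ** 2      # squared spatial term
    return math.sqrt(d_grey ** 2 + omega * d_space / (M ** 2))
```

On a uniform image the grey term vanishes, so the distance grows only with spatial separation; increasing omega then makes the resulting grids more regular, matching the property stated above.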
Step B, adopting an iterative operation: when the similar distance of a pixel point in the unmanned aerial vehicle visual image at the current iteration is smaller than the corresponding similar distance at the previous iteration, classifying the pixel point according to its similar distance to generate a second image grid, and re-executing step A, calculating the similar distance between each pixel point in the second image grid and the central pixel point of the second image grid, until the second image grid no longer changes;
specifically, when calculating the similar distance between each pixel point and the central pixel point, there are cases that each pixel point corresponds to a plurality of similar distances, at this time, the similar distances of each pixel point need to be screened, and the minimum similar distance is reserved to be used as the final similar distance of the pixel point and compared with the similar distance in the previous iteration process.
It should be noted that, the initial value of the similarity distance value of each pixel point is infinity. Therefore, after the first image grid is obtained, at least one classification and division of the second image grid are performed when the iterative operation is performed.
Since the similar distance represents the relationship between each pixel point and the central pixel point, the pixel points can be classified to the central pixel point with the closest similar distance through the comparison of the similar distances, and then the second image grid is generated.
Thus, after one iteration, the size or boundary of the area of each second image grid can change with the change of the classification of the pixel points, and the change of the area causes the position of the central pixel point to change, so that in the next iteration the calculated similar distance of each pixel point also changes with the new central pixel position and area size. When the area of each grid no longer changes, the position of the central pixel point no longer changes, and the final second image grid is obtained.
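The assign-then-update iteration just described can be sketched as follows (a simplified sketch: the seed placement, the rounded-mean centre update and all names are assumptions):

```python
import math

def cluster_pixels(img, centers, omega, M, max_iter=10):
    """Iteratively assign every pixel to the centre with the smallest similar
    distance, then move each centre to the mean coordinate of its pixels,
    until the assignment no longer changes (the final second image grid).

    centers: list of (ic, jc) seed coordinates, e.g. the geometric centres of
    the equidistant first image grids."""
    h, w = len(img), len(img[0])

    def dist(i, j, ic, jc):
        dc = img[i][j] - img[ic][jc]
        ds = (i - ic) ** 2 + (j - jc) ** 2
        return math.sqrt(dc * dc + omega * ds / (M * M))

    labels = [[-1] * w for _ in range(h)]
    for _ in range(max_iter):
        changed = False
        for i in range(h):
            for j in range(w):
                best = min(range(len(centers)),
                           key=lambda k: dist(i, j, centers[k][0], centers[k][1]))
                if labels[i][j] != best:
                    labels[i][j] = best
                    changed = True
        if not changed:
            break  # grids stable: final second image grid reached
        # move each centre to the rounded mean coordinate of its pixels
        for k in range(len(centers)):
            pts = [(i, j) for i in range(h) for j in range(w) if labels[i][j] == k]
            if pts:
                centers[k] = (round(sum(p[0] for p in pts) / len(pts)),
                              round(sum(p[1] for p in pts) / len(pts)))
    return labels
```

On a small image whose left and right halves differ sharply in grey value, two seeds converge to one label per half, illustrating how the second image grids follow the image content rather than the initial equidistant division.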
Through the technical scheme in this application, on the basis of the first image grid obtained by equidistant coarse division, the similar distance is introduced to classify the pixel points within the larger first image grids, realizing a secondary division of the unmanned aerial vehicle visual image and generating the second image grid, so that each pixel point in the visual image is classified to the central pixel point with the closest similar distance. By dividing the visual image successively into the first and second image grids, the over-segmentation and under-segmentation problems of the image segmentation process can be solved, and the number of segmented grids is reduced on the premise of guaranteeing image segmentation accuracy, which helps to improve the accuracy and efficiency of obstacle contour recognition.
Step C, calculating a local variance coefficient of each second image grid and a global variance coefficient of the unmanned aerial vehicle visual image according to the pixel value of each pixel point in the second image grid and the number of pixel points, wherein the local variance coefficient is the ratio of the pixel variance of a second image grid to the pixel mean of that grid, and the global variance coefficient is the ratio of the pixel variance of the whole unmanned aerial vehicle visual image to its pixel mean.
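A minimal sketch of the variance-to-mean ratio used for both coefficients (the function name is hypothetical; it is applied per second image grid for the local coefficient and over all pixels for the global one):

```python
def variance_coefficient(pixels):
    """Ratio of pixel variance to pixel mean over a region of grey values.

    Assumes a nonzero mean (grey values > 0); a uniform region yields 0."""
    n = len(pixels)
    mean = sum(pixels) / n
    var = sum((p - mean) ** 2 for p in pixels) / n  # population variance
    return var / mean
```

A grid whose local coefficient exceeds the global coefficient is comparatively inhomogeneous, which is why step D marks it for further splitting.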
Step D, when the local variance coefficient is judged to be larger than the global variance coefficient, the second image grid corresponding to the local variance coefficient is marked as an image grid to be split, equidistant refinement and splitting are carried out on the image grid to be split based on a second preset range, and the split image grid to be split and the second image grid not split form a third image grid;
Step E, calculating the pixel mean in each third image grid, traversing the third image grids, and merging a third image grid with a grid to be merged when the difference between the pixel mean of the third image grid and the pixel mean of the grid to be merged is smaller than a merging threshold, wherein the grid to be merged is any third image grid in the four connected domains of that third image grid;
specifically, as shown in fig. 3, setting the third image grid in the current traversal process as H1, and then, the grids corresponding to the four connected domains of the third image grid are respectively third grids H2-H5, and calculating the difference between the pixel mean value of the third grid H1 and the pixel mean value of the third grids H2-H5. For the third grid H2, if the difference value of the pixel mean values between the third grid H1 and the third grid H2 is smaller than the merging threshold, merging the third grid H1 and the third grid H2; otherwise, no merging is performed. Until the traversal of all third grids is completed.
Step F, generating a first obstacle contour according to the grid merging result.
In this embodiment, the division into second image grids classifies the pixel points of the unmanned aerial vehicle visual image. Combined with the local variance coefficient, the clustered pixel points are then refined and split equidistantly to generate the third image grids, which avoids missing gray-scale features of pixel points and improves the accuracy of obstacle contour recognition. Finally, third image grids whose pixel means differ only slightly are merged by traversing four-connected domains, and the obstacle in the unmanned aerial vehicle visual image is identified from the grid merging result, so that the final obstacle contour is obtained quickly and accurately.
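Steps C to F amount to a classic split-and-merge segmentation over image grids. A minimal Python/NumPy sketch of the idea follows; the helper names, grid sizes, and thresholds are illustrative assumptions, since the patent leaves the concrete values to the embodiment:

```python
import numpy as np

def coeff_of_variation(pixels):
    """Ratio of pixel variance to pixel mean -- the 'variance coefficient'
    of steps C/D (a zero mean is guarded to avoid division by zero)."""
    m = float(pixels.mean())
    return float(pixels.var()) / m if m > 0 else 0.0

def split_grids(image, grid=8, fine=4):
    """Steps C/D: tile the image into `grid`-sized cells ('second image
    grids'); cells whose local coefficient exceeds the global one are
    refined equidistantly into `fine`-sized sub-cells. Returns a list of
    (row_slice, col_slice) cells (the 'third image grids')."""
    global_cv = coeff_of_variation(image)
    h, w = image.shape
    cells = []
    for r in range(0, h, grid):
        for c in range(0, w, grid):
            rs = slice(r, min(r + grid, h))
            cs = slice(c, min(c + grid, w))
            if coeff_of_variation(image[rs, cs]) > global_cv:
                for rr in range(rs.start, rs.stop, fine):   # equidistant refinement
                    for cc in range(cs.start, cs.stop, fine):
                        cells.append((slice(rr, min(rr + fine, rs.stop)),
                                      slice(cc, min(cc + fine, cs.stop))))
            else:
                cells.append((rs, cs))
    return cells

def merge_cells(image, cells, merge_thresh=10.0):
    """Steps E/F: union-find merge of 4-adjacent cells whose pixel-mean
    difference is below `merge_thresh`; returns a per-pixel region label
    map whose connected regions approximate obstacle contours."""
    means = [float(image[rs, cs].mean()) for rs, cs in cells]
    label = np.empty(image.shape, dtype=int)
    for i, (rs, cs) in enumerate(cells):
        label[rs, cs] = i
    parent = list(range(len(cells)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    h, w = image.shape
    for r in range(h):
        for c in range(w):
            for dr, dc in ((1, 0), (0, 1)):                 # four-connected domain
                rr, cc = r + dr, c + dc
                if rr < h and cc < w and label[r, c] != label[rr, cc]:
                    a, b = find(label[r, c]), find(label[rr, cc])
                    if a != b and abs(means[a] - means[b]) < merge_thresh:
                        parent[b] = a
    return np.vectorize(find)(label)
```

On a toy image whose left half is dark and right half bright, the two halves come out as two distinct merged regions.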
The distance calculating module 42 is configured to extract a first pixel feature point and a second pixel feature point in the first obstacle contour, and calculate a first contour distance in the unmanned aerial vehicle visual image based on the first pixel feature point and the second pixel feature point, where the first contour distance is a distance between the first pixel feature point and the second pixel feature point;
the contour recognition module 41 is further configured to recognize a second obstacle contour in the visual image of the unmanned aerial vehicle in a second preset navigation period by adopting an image segmentation manner;
the distance calculating module 42 is further configured to extract a third pixel feature point and a fourth pixel feature point in the second obstacle contour based on the first pixel feature point and the second pixel feature point, so as to calculate a second contour distance in the unmanned aerial vehicle visual image, where the second contour distance is a distance between the third pixel feature point and the fourth pixel feature point;
specifically, the first pixel characteristic point and the second pixel characteristic point are determined based on the position coordinates of each pixel point in the first obstacle outline. In this embodiment, the pixel feature point may be an inflection point in the obstacle outline.
For example: if the obstacle is a large-volume one such as a building, the pixel feature points are extracted based on features of the building's structural appearance; if the obstacle is a small one such as a billboard or a tree, the edge points of the obstacle contour can be used directly as the pixel feature points.
In this embodiment, a billboard is used as the obstacle, as shown in fig. 2. The unmanned aerial vehicle navigates from left to right; a plane rectangular coordinate system is established with the navigation direction of the unmanned aerial vehicle as the x axis and its horizontal direction as the y axis. At the moment of the preset navigation period T1, the unmanned aerial vehicle is located at point A and the obstacle D is a billboard; the first obstacle contour of obstacle D in the acquired unmanned aerial vehicle visual image is E1, and the first and second pixel feature points are selected as the edge points on the two horizontal sides of the first obstacle contour E1, so the first contour distance is the horizontal length of obstacle D in the unmanned aerial vehicle visual image at time T1.
When the preset navigation period T2 is reached, the unmanned aerial vehicle is located at a point B, at the moment, the second obstacle outline of the obstacle D in the unmanned aerial vehicle visual image is E2, and similarly, edge points on two sides of the second obstacle outline E2 in the horizontal direction are used as a third pixel characteristic point and a fourth pixel characteristic point, and the second outline distance is the length of the obstacle D in the horizontal direction in the unmanned aerial vehicle visual image at the moment T2.
The navigation speed of the unmanned aerial vehicle is kept unchanged between the preset navigation periods T1 and T2, so the value of the profile change can be calculated based on trigonometric relations.
Note that, the dashed line in fig. 2 is the visual range of the unmanned aerial vehicle.
The contour change calculating module 43 is configured to calculate a difference between the first contour distance and the second contour distance, and record the difference as a value of the magnitude of the contour change.
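The distance and contour-change modules reduce to a few lines of geometry. In the sketch below, `separation_estimate` is my pinhole-camera reading of the "trigonometric function relation" referenced above and in claim 2 — the patent does not spell the equation out, so treat it as an assumption:

```python
import math

def contour_distance(p1, p2):
    """Pixel distance between two feature points of an obstacle contour
    (e.g. the first and second, or third and fourth, pixel feature points)."""
    return math.dist(p1, p2)

def profile_change(first_pts, second_pts):
    """Difference between the contour distance at period T2 and at T1;
    a positive value is taken here to mean the obstacle grew in the image,
    i.e. the drone is approaching it (sign convention assumed)."""
    return contour_distance(*second_pts) - contour_distance(*first_pts)

def separation_estimate(d1, d2, speed, period):
    """Hypothetical pinhole-camera relation: at constant speed the drone
    advances speed*period between frames, and similar triangles give the
    remaining separation Z2 = speed*period * d1 / (d2 - d1)."""
    if d2 <= d1:
        raise ValueError("obstacle must appear larger in the second frame")
    return speed * period * d1 / (d2 - d1)
```

For instance, if the contour width grows from 4 px to 8 px over one period at 5 m/s with a 2 s period, the estimated remaining separation is 10 m.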
On the basis of the above embodiment, in order to improve accuracy of correcting the navigation direction and the navigational speed and fully utilize wind power to improve the cruising ability of the unmanned aerial vehicle, the navigation correction module 40 in this embodiment specifically further includes:
A first correction module 44, configured to, when it is determined that the distance between the unmanned aerial vehicle and the obstacle is less than or equal to a preset course adjustment distance, determine the point on the obstacle closest to the unmanned aerial vehicle and take it as the circle centre; generate a course adjustment circle with the preset course adjustment distance as the radius; generate an inscribed polygon of the course adjustment circle with a given inscribed side length; select a course correction edge and calculate its included angle with the current navigation direction, recorded as the course adjustment angle α; and correct the initial navigation direction based on the course adjustment angle α. The inscribed side length is the product of the current speed of the unmanned aerial vehicle and the duration of a preset navigation period, and the course correction edge is the side of the inscribed polygon that is closest to the unmanned aerial vehicle and points away from the obstacle;
specifically, as shown in fig. 2, when the unmanned aerial vehicle navigates to the point B, by determining that the distance between the unmanned aerial vehicle and the obstacle D is equal to the preset heading adjustment distance, the track adjustment circle is generated by taking the preset heading adjustment distance as a radius and taking the vertex of the lower edge of the obstacle D as a circle center, wherein the circular arc F is a part of the track adjustment circle.
Because point B lies on the track adjustment circle, it is taken as one vertex of the inscribed polygon, and the product of the current speed of the unmanned aerial vehicle and the duration of the preset navigation period is taken as the inscribed side length, generating the inscribed polygon. To steer the unmanned aerial vehicle away from obstacle D, the side G is selected as the course correction edge; the included angle α between the course correction edge G and the current navigation direction is calculated, recorded as the course adjustment angle, and used to correct the initial navigation direction.
This embodiment does not limit the specific implementation of correcting the navigation direction based on this angle.
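The geometry above can be sketched directly. This is a hypothetical construction: the patent does not state which rotation sense counts as "away from the obstacle" or that the polygon is regular, so both are assumptions here:

```python
import math

def course_adjustment_angle(drone, center, heading, speed, period, away_ccw=True):
    """The drone sits on a course adjustment circle of radius R centred on
    the nearest obstacle point. One side of the inscribed polygon has
    length s = speed * period, so the central angle per side is
    2*asin(s/(2R)). The next polygon vertex is obtained by rotating the
    drone's position about the centre by that angle (sense chosen by
    `away_ccw`), and the returned value is the angle, in radians, between
    the resulting course-correction edge and the current heading vector."""
    R = math.dist(drone, center)
    s = speed * period
    central = 2.0 * math.asin(min(1.0, s / (2.0 * R)))
    if not away_ccw:
        central = -central
    dx, dy = drone[0] - center[0], drone[1] - center[1]
    cos_a, sin_a = math.cos(central), math.sin(central)
    nxt = (center[0] + dx * cos_a - dy * sin_a,
           center[1] + dx * sin_a + dy * cos_a)
    edge = (nxt[0] - drone[0], nxt[1] - drone[1])
    dot = edge[0] * heading[0] + edge[1] * heading[1]
    return math.acos(dot / (math.hypot(*edge) * math.hypot(*heading)))
```

For example, on a circle of radius 10 m with a side length of 10 m (speed 5 m/s, period 2 s), the central angle is 60° and a drone heading straight at the centre must turn 60° onto the correction edge.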
And the second correction module 45 is configured to calculate a first navigational speed correction value based on a wind direction angle and a wind speed when it is determined that the distance between the unmanned aerial vehicle and the obstacle is greater than the preset heading adjustment distance, and correct the initial navigational speed according to the first navigational speed correction value.
Specifically, when the course of the unmanned aerial vehicle does not need to be corrected, the first navigational speed correction value can be calculated directly from the wind direction angle and the wind speed. To improve the accuracy of the navigational speed correction, the vertical plane of the current heading is divided into eight navigational speed correction areas, each spanning 45 degrees, so that the calculation formula for the first navigational speed correction value Δv1 is as follows:
where Δv1 is the first navigational speed correction value, v_T is the wind speed at the moment of the current preset navigation period T, and θ_T is the wind direction angle at the moment of the current preset navigation period T.
After the first navigational speed correction value Δv1 is obtained, the initial navigational speed calculated by the current inertial navigation is corrected by summation, so that the initial navigational speed is corrected on the basis of the wind speed, extending the endurance of the unmanned aerial vehicle.
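The formula for Δv1 itself is an image that did not survive extraction; only its symbol list remains. The sketch below is therefore purely an assumption that mirrors the described structure — eight 45° sectors around the current heading, with the wind speed projected onto the course through the sector's centre angle:

```python
import math

def first_speed_correction(wind_speed, wind_angle_rad):
    """Illustrative stand-in for the missing Delta-v1 formula: quantise
    the wind direction angle (relative to the heading) into one of eight
    45-degree sectors and project the wind speed using the sector centre.
    A tailwind (sector centred on 0 rad) raises the speed, a headwind
    (sector centred on pi) lowers it. Not the patent's exact formula."""
    sector = round(wind_angle_rad / (math.pi / 4)) % 8   # sector index 0..7
    centre = sector * (math.pi / 4)                      # sector centre angle
    return wind_speed * math.cos(centre)
```

With a 10 m/s tailwind the correction is +10 m/s; with the same wind as a direct headwind it is −10 m/s.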
Further, for the situation that the unmanned aerial vehicle course needs to be corrected, in order to ensure the accuracy of the navigational speed correction and improve the control precision of the unmanned aerial vehicle navigation, the first correction module 44 is further configured to:
calculate a second navigational speed correction value based on the vector sum of the course adjustment angle and the wind direction angle, combined with the wind speed and the historical track, and correct the initial navigational speed according to the second navigational speed correction value, wherein the calculation formula of the second navigational speed correction value is as follows:
Δv2 = Δv1′ + Δv_T
a_1 = 0.5(MaxIter − ηT)/MaxIter
where Δv2 is the second navigational speed correction value, Δv1′ is the wind speed correction parameter, Δv_T is the track correction parameter, v_T is the wind speed at the moment of the current preset navigation period T, θ_T is the wind direction angle at the moment of the current preset navigation period T, α is the course adjustment angle, t is an intermediate parameter with t = 1, 2, …, η, η is the number of preset navigation periods elapsed when the unmanned aerial vehicle reaches the current moment, a_t are the equal-partition parameters, in which a_2, a_3, …, a_(η−1) are equal divisions of the interval from a_1 to a_η and a_η = 1, rand(t) is the t-th random number, uniformly distributed in the interval (0, 1), corresponding to the t-th equal-partition parameter a_t, MaxIter is the maximum navigation duration of the unmanned aerial vehicle, ηT is the elapsed navigation duration of the unmanned aerial vehicle, x_t is the t-th track coordinate in the historical track, and x_η is the position coordinate of the unmanned aerial vehicle at the current preset navigation period; the historical track consists of the position coordinates of the unmanned aerial vehicle at each preset navigation period.
After the second navigational speed correction value Deltav 2 is obtained, the initial navigational speed calculated by the current inertial navigation is corrected in a sum operation mode, so that the correction of the initial navigational speed based on the course adjustment angle, the wind direction angle, the wind speed and the historical track is realized, and the navigational control precision of the unmanned aerial vehicle is improved.
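Of the Δv2 formulas, only the sum Δv2 = Δv1′ + Δv_T and the first partition weight a_1 survive in the extracted text; the expressions for Δv1′ and the track term Δv_T are missing images. The sketch below implements the equal-partition weights exactly as stated and fills in a hypothetical Δv_T (a rand-weighted mean of per-period track speeds, suggested by the symbol list but not confirmed by the patent) to show how the pieces could fit together:

```python
import math, random

def aliquot_parameters(eta, T_len, max_iter):
    """The equal-partition weights a_1..a_eta as given in the text:
    a_1 = 0.5*(MaxIter - eta*T)/MaxIter, a_eta = 1, and the values in
    between are equally spaced (an arithmetic sequence)."""
    a1 = 0.5 * (max_iter - eta * T_len) / max_iter
    if eta == 1:
        return [1.0]
    step = (1.0 - a1) / (eta - 1)
    return [a1 + t * step for t in range(eta)]

def second_speed_correction(dv1_prime, track, T_len, max_iter, rng=random.random):
    """Hypothetical assembly of Delta-v2 = Delta-v1' + Delta-vT.
    Delta-vT is assumed here to be the mean of per-period track speeds
    |x_{t+1} - x_t| / T, each weighted by a_t * rand(t). This is a
    sketch of the structure, not the patent's exact formula."""
    eta = len(track)
    a = aliquot_parameters(eta, T_len, max_iter)
    speeds = [math.dist(track[t + 1], track[t]) / T_len for t in range(eta - 1)]
    if not speeds:
        return dv1_prime
    dv_T = sum(a[t] * rng() * s for t, s in enumerate(speeds)) / len(speeds)
    return dv1_prime + dv_T
```

With rand(t) pinned to 1 for reproducibility, three track points spaced 1 m apart over 1 s periods and MaxIter = 10 give weights [0.35, 0.675, 1.0] and Δv_T = 0.5125 m/s.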
And the navigation control module 50 is configured to generate a navigation instruction of the unmanned aerial vehicle according to the corrected navigation direction and the corrected navigation speed, and repeatedly execute the above-mentioned processes until the unmanned aerial vehicle navigation is completed.
Specifically, through the above process in this embodiment, the obstacle in the unmanned aerial vehicle visual image is identified by visual processing techniques and, combined with the calculated separation from the obstacle, the navigation direction and navigation speed output by inertial navigation are corrected on the basis of the obstacle, wind speed, and wind direction. This helps improve the accuracy of unmanned aerial vehicle inertial navigation, reduces the complexity of the navigation algorithm, and improves the real-time performance of unmanned aerial vehicle navigation control.
The technical scheme of the application is explained in detail above with reference to the accompanying drawings, and the application provides an inertial integrated navigation system suitable for an unmanned aerial vehicle, comprising: the environment information acquisition module is used for acquiring the wind speed and the wind direction angle at the current moment in the navigation process of the unmanned aerial vehicle; the image acquisition module is used for acquiring an unmanned aerial vehicle visual image in the unmanned aerial vehicle navigation process; the inertial navigation module is used for acquiring an initial navigation direction and an initial navigation speed in the unmanned aerial vehicle navigation process based on a preset navigation period and an inertial navigation algorithm; the navigation correction module is used for determining the distance between the obstacle in the visual image of the unmanned aerial vehicle and the unmanned aerial vehicle, and correcting the initial navigation direction and the initial navigation speed by combining the wind speed and the wind direction angle; and the navigation control module is used for generating an unmanned aerial vehicle navigation instruction according to the corrected navigation direction and the corrected navigation speed. Through the technical scheme in this application, based on unmanned aerial vehicle navigation in-process barrier and wind speed, wind direction, carry out navigation correction, improve unmanned aerial vehicle inertial navigation's precision, reduce navigation algorithm's complexity.
The steps in the present application may be reordered, combined, or omitted according to actual requirements.
The units in the device may be combined, divided, or omitted according to actual requirements.
Although the present application is disclosed in detail with reference to the accompanying drawings, it is to be understood that such descriptions are merely illustrative and are not intended to limit the scope of the present application. The scope of the present application is defined by the appended claims and may include various modifications, alterations, and equivalents of the invention without departing from its scope and spirit.

Claims (3)

1. An inertial integrated navigation system for an unmanned aerial vehicle, the system comprising: the navigation system comprises an environment information acquisition module, an image acquisition module, an inertial navigation module, a navigation correction module and a navigation control module;
the environment information acquisition module is used for acquiring the wind speed and the wind direction angle at the current moment in the navigation process of the unmanned aerial vehicle;
the image acquisition module is used for acquiring an unmanned aerial vehicle visual image in the unmanned aerial vehicle navigation process;
the inertial navigation module is used for acquiring an initial navigation direction and an initial navigation speed in the unmanned aerial vehicle navigation process based on a preset navigation period and an inertial navigation algorithm;
the navigation correction module is used for determining the distance between the obstacle in the visual image of the unmanned aerial vehicle and the unmanned aerial vehicle, and correcting the initial navigation direction and the initial navigation speed by combining the wind speed and the wind direction angle;
the navigation correction module comprises: the device comprises a contour recognition module, a distance calculation module and a contour change calculation module;
the profile recognition module is used for recognizing a first obstacle profile in the visual image of the unmanned aerial vehicle in a first preset navigation period by adopting an image segmentation mode, and,
identifying a second obstacle outline in the visual image of the unmanned aerial vehicle in a second preset navigation period by adopting an image segmentation mode;
the distance calculation module is used for extracting a first pixel characteristic point and a second pixel characteristic point in the first obstacle outline, calculating a first outline distance of the obstacle based on the first pixel characteristic point and the second pixel characteristic point,
extracting third and fourth pixel feature points in the second obstacle contour based on the first and second pixel feature points to calculate a second contour distance of the obstacle,
the first contour distance is the distance between the first pixel characteristic point and the second pixel characteristic point, and the second contour distance is the distance between the third pixel characteristic point and the fourth pixel characteristic point;
the profile change calculation module is used for calculating the difference value between the first profile distance and the second profile distance and recording the difference value as the value of the profile change;
the navigation correction module comprises: a first correction module;
the first correction module is used for, when the distance between the unmanned aerial vehicle and the obstacle is smaller than or equal to a preset course adjustment distance, determining the point on the obstacle closest to the unmanned aerial vehicle, taking it as the circle centre, generating a course adjustment circle with the preset course adjustment distance as the radius, generating an inscribed polygon of the course adjustment circle with an inscribed side length, selecting a course correction edge, calculating the included angle between the course correction edge and the current navigation direction, recording the included angle as a course adjustment angle, and correcting the initial navigation direction based on the course adjustment angle,
wherein the inscribed side length of the inscribed polygon is the product of the current speed of the unmanned aerial vehicle and the duration of the preset navigation period, and the course correction edge is the side of the inscribed polygon that is closest to the unmanned aerial vehicle and points away from the obstacle;
the first correction module is further used for calculating a second navigational speed correction value based on the vector sum of the course adjustment angle and the wind direction angle and combining the wind speed and the historical track, and correcting the initial navigational speed according to the second navigational speed correction value;
the navigation correction module further comprises: a second correction module;
the second correction module is used for calculating a first navigational speed correction value based on the wind direction angle and the wind speed when the distance between the unmanned aerial vehicle and the obstacle is larger than the preset course adjustment distance, and correcting the initial navigational speed according to the first navigational speed correction value;
and the navigation control module is used for generating an unmanned aerial vehicle navigation instruction according to the corrected navigation direction and the corrected navigation speed.
2. The inertial integrated navigation system for a drone of claim 1, wherein a separation between an obstacle in the drone visual image and the drone is determined by a value of a current speed of the drone and a magnitude of a profile change of the obstacle,
the value of the profile change is the height difference value of the same position of the obstacle in the visual image of the unmanned aerial vehicle in two adjacent preset navigation periods.
3. Inertial integrated navigation system suitable for use in a drone according to any one of claims 1 to 2, wherein the system further comprises: a camera;
the camera is electrically connected to the image acquisition module, the camera is a monocular camera or a binocular camera, and the camera is used for shooting the visual image of the unmanned aerial vehicle.
CN202110890897.XA 2021-08-04 2021-08-04 Inertial integrated navigation system suitable for unmanned aerial vehicle Active CN113465598B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110890897.XA CN113465598B (en) 2021-08-04 2021-08-04 Inertial integrated navigation system suitable for unmanned aerial vehicle


Publications (2)

Publication Number Publication Date
CN113465598A CN113465598A (en) 2021-10-01
CN113465598B true CN113465598B (en) 2024-02-09

Family

ID=77883948

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110890897.XA Active CN113465598B (en) 2021-08-04 2021-08-04 Inertial integrated navigation system suitable for unmanned aerial vehicle

Country Status (1)

Country Link
CN (1) CN113465598B (en)

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPWO2008099475A1 (en) * 2007-02-14 2010-05-27 パイオニア株式会社 NAVIGATION DEVICE, NAVIGATION METHOD, AND NAVIGATION PROGRAM
CN106708066A (en) * 2015-12-20 2017-05-24 中国电子科技集团公司第二十研究所 Autonomous landing method of unmanned aerial vehicle based on vision/inertial navigation
CN107014380A (en) * 2017-05-26 2017-08-04 西安科技大学 The Combinated navigation method of vision guided navigation and inertial navigation based on aircraft
WO2018157309A1 (en) * 2017-02-28 2018-09-07 深圳市大疆创新科技有限公司 Air route correction method and device and unmanned aerial vehicle
CN108844538A (en) * 2018-05-07 2018-11-20 中国人民解放军国防科技大学 Unmanned aerial vehicle obstacle avoidance waypoint generation method based on vision/inertial navigation
CN111207745A (en) * 2020-02-20 2020-05-29 北京星际导控科技有限责任公司 Inertia measurement method suitable for vertical gyroscope of large maneuvering unmanned aerial vehicle
CN111272148A (en) * 2020-01-20 2020-06-12 江苏方天电力技术有限公司 Unmanned aerial vehicle autonomous inspection self-adaptive imaging quality optimization method for power transmission line
RU2728197C1 (en) * 2019-08-05 2020-07-28 Акционерное общество "Концерн радиостроения "Вега" Method to control a group of unmanned aerial vehicles taking into account the degree of danger of surrounding objects
CN111929718A (en) * 2020-06-12 2020-11-13 东莞市普灵思智能电子有限公司 Automatic driving object detection and positioning system and method
CN112304304A (en) * 2020-10-23 2021-02-02 国网智能科技股份有限公司 Patrol unmanned aerial vehicle, system and method suitable for transformer substation
CN112630813A (en) * 2020-11-24 2021-04-09 中国人民解放军国防科技大学 Unmanned aerial vehicle attitude measurement method based on strapdown inertial navigation and Beidou satellite navigation system
CN112799426A (en) * 2020-12-25 2021-05-14 陈南方 Unmanned aerial vehicle navigation control system and method based on big data analysis
CN112965517A (en) * 2021-01-31 2021-06-15 国网江苏省电力有限公司常州供电分公司 Unmanned aerial vehicle inspection safety obstacle avoidance system and method based on binocular vision fusion laser radar and electromagnetic field detection
CN113075712A (en) * 2021-03-17 2021-07-06 北京云恒科技研究院有限公司 Autonomous controllable multi-system high-precision navigation equipment and navigation method
CN113188557A (en) * 2021-04-28 2021-07-30 江苏方天电力技术有限公司 Visual inertial integrated navigation method fusing semantic features

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100761011B1 (en) * 2006-05-30 2007-09-21 학교법인 인하학원 Aiding inertial navigation system using a camera type sun sensor and method there of
US10393527B2 (en) * 2017-12-20 2019-08-27 National Chung Shan Institute Of Science And Technology UAV navigation obstacle avoidance system and method thereof


Non-Patent Citations (7)

* Cited by examiner, † Cited by third party
Title
An autonomous landing navigation algorithm for unmanned aerial vehicles based on inertial/visual information fusion; Liu Chang; Yang Liangjun; Liu Feng; Wang Yi; Xu Ce; Liu Chongliang; Navigation Positioning and Timing (No. 06); full text *
Research on navigation correction of autonomous farming tractors based on GPS/INS; Zhao Shushang, Li Huibin, Liu Bin; Journal of Agricultural Mechanization Research; full text *
UAV obstacle avoidance and track recovery strategy based on the velocity obstacle method; Zhang Honghong; Gan Xusheng; Li Ang; Gao Zhiqiang; Xu Xinyu; Systems Engineering and Electronics (No. 08); full text *
A survey of scene-matching visual navigation technology for unmanned aerial vehicles; Zhao Chunhui et al.; Scientia Sinica Informationis (05); full text *

Also Published As

Publication number Publication date
CN113465598A (en) 2021-10-01

Similar Documents

Publication Publication Date Title
CN111626217B (en) Target detection and tracking method based on two-dimensional picture and three-dimensional point cloud fusion
CN110588653B (en) Control system, control method and controller for autonomous vehicle
CN110675307B (en) Implementation method from 3D sparse point cloud to 2D grid graph based on VSLAM
Li et al. Springrobot: A prototype autonomous vehicle and its algorithms for lane detection
US8818702B2 (en) System and method for tracking objects
US11556745B2 (en) System and method for ordered representation and feature extraction for point clouds obtained by detection and ranging sensor
Ohgushi et al. Road obstacle detection method based on an autoencoder with semantic segmentation
Erbs et al. Moving vehicle detection by optimal segmentation of the dynamic stixel world
CN111213155A (en) Image processing method, device, movable platform, unmanned aerial vehicle and storage medium
EP2940656B1 (en) Vehicle periphery monitoring device
EP2887315B1 (en) Camera calibration device, method for implementing calibration, program and camera for movable body
Cao et al. Robust place recognition and loop closing in laser-based SLAM for UGVs in urban environments
CN115049700A (en) Target detection method and device
Wen et al. Research on 3D point cloud de-distortion algorithm and its application on Euclidean clustering
Sun et al. Obstacle Detection of Intelligent Vehicle Based on Fusion of Lidar and Machine Vision.
Gálai et al. Crossmodal point cloud registration in the Hough space for mobile laser scanning data
Gökçe et al. Recognition of dynamic objects from UGVs using Interconnected Neuralnetwork-based Computer Vision system
Kim et al. MOD: Multi-camera based local position estimation for moving objects detection
CN113465598B (en) Inertial integrated navigation system suitable for unmanned aerial vehicle
Huang et al. A coarse-to-fine LiDar-based SLAM with dynamic object removal in dense urban areas
CN116206286A (en) Obstacle detection method, device, equipment and medium under high-speed road condition
US20240151855A1 (en) Lidar-based object tracking
CN113390410B (en) Inertial integrated navigation method suitable for unmanned aerial vehicle
CN113554705B (en) Laser radar robust positioning method under changing scene
Chai et al. 3D gesture recognition method based on faster R-CNN network

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant