CN116625372A - Multi-source unmanned aerial vehicle navigation method and device, storage medium and electronic equipment - Google Patents

Multi-source unmanned aerial vehicle navigation method and device, storage medium and electronic equipment

Info

Publication number
CN116625372A
CN116625372A (application CN202310595059.9A)
Authority
CN
China
Prior art keywords
data
sensing
factor graph
sensing data
node
Prior art date
Legal status
Pending
Application number
CN202310595059.9A
Other languages
Chinese (zh)
Inventor
朱琦
施航
任祖杰
胡慧珠
孙沁璇
权思航
缪锐
刘洋
彭风光
Current Assignee
Zhejiang Lab
Original Assignee
Zhejiang Lab
Priority date
Filing date
Publication date
Application filed by Zhejiang Lab filed Critical Zhejiang Lab
Priority to CN202310595059.9A priority Critical patent/CN116625372A/en
Publication of CN116625372A publication Critical patent/CN116625372A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C 21/20: Instruments for performing navigational calculations
    • G01C 21/10: Navigation by using measurements of speed or acceleration
    • G01C 21/12: Measurements executed aboard the object being navigated; Dead reckoning
    • G01C 21/16: Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C 21/165: Inertial navigation combined with non-inertial navigation instruments
    • G01C 21/1656: Inertial navigation combined with non-inertial navigation instruments with passive imaging devices, e.g. cameras
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 19/00: Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S 19/38: Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S 19/39: Systems transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S 19/42: Determining position

Abstract

The specification discloses a multi-source unmanned aerial vehicle navigation method and device, a storage medium, and an electronic device. A factor graph that takes each type of sensing data as a node is fitted in advance from the different types of first sensing data collected by the sensing devices and from the navigation algorithm used to generate navigation information. When navigation information is generated from the factor graph, each piece of second sensing data currently collected by the sensing devices is determined, and whether the corresponding sensing device has failed is judged from each piece of second sensing data. If so, the information of the node corresponding to that second sensing data in the factor graph is adjusted, and navigation information is generated from the adjusted factor graph and each piece of second sensing data. When a sensing device is judged to be faulty, the method uses the plug-and-play property of the factor graph to adjust the node corresponding to the data collected by that device and regenerates the navigation information, which improves the reliability and stability of the navigation system and keeps the navigation information it outputs accurate.

Description

Multi-source unmanned aerial vehicle navigation method and device, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of intelligent navigation, and in particular, to a method and apparatus for navigating a multi-source unmanned aerial vehicle, a storage medium, and an electronic device.
Background
With the development of intelligent navigation technology, it has been applied in many fields, such as autonomous driving and unmanned aerial vehicle navigation. In current unmanned aerial vehicle navigation, various sensors collect environment information, positioning information, and so on, and the perceived data are fused to output navigation information such as the current position, speed, and attitude of the unmanned aerial vehicle and its navigation path. However, a sensor may fail, which reduces the accuracy of the information it collects and makes the final output navigation information inaccurate, lowering the safety of the unmanned aerial vehicle during autonomous flight and its adaptability to complex, dynamic environments.
Based on the above, the present specification provides a multi-source unmanned aerial vehicle navigation method.
Disclosure of Invention
The present disclosure provides a method, an apparatus, a storage medium, and an electronic device for navigating a multi-source unmanned aerial vehicle, so as to partially solve the above-mentioned problems in the prior art.
The technical scheme adopted in the specification is as follows:
The specification provides a multi-source unmanned aerial vehicle navigation method, wherein the multi-source unmanned aerial vehicle includes sensing devices for collecting different types of sensing data, and the method includes:
fitting a factor graph taking each type of sensing data as a node according to different types of first sensing data acquired by the sensing equipment and a navigation algorithm for generating navigation information according to the first sensing data in advance;
when navigation information is generated according to the factor graph, determining second perception data acquired by the perception equipment currently;
judging whether sensing equipment for collecting the second sensing data of each type fails or not according to the second sensing data of the type;
if yes, adjusting the information of the node corresponding to the second sensing data in the factor graph, and generating navigation information according to the adjusted factor graph and each piece of second sensing data.
Optionally, according to the second sensing data of the type, judging whether the sensing device for collecting the second sensing data of the type fails or not, which specifically includes:
determining a fault threshold corresponding to the second perception data of the type;
judging whether the second perception data of the type is larger than the fault threshold value or not;
If yes, the sensing equipment of the second sensing data of the type fails;
if not, the sensing device of the second sensing data of the type does not fail.
Optionally, before generating the navigation information according to the adjusted factor graph and each second perception data, the method further comprises:
judging whether nodes matched with the second perception data exist in the factor graph or not according to the second perception data of each type;
if not, the second perception data is added to the factor graph as a node of the factor graph.
Optionally, adjusting information of a node corresponding to the second sensing data in the factor graph specifically includes:
inputting the second perception data into a pre-trained data correction model, and determining the output correction data of the data correction model;
determining a difference value between the second sensing data and the correction data as error information;
judging whether the data value of the error information is larger than the error threshold value or not according to the error information and a predetermined error threshold value;
if yes, disabling the node corresponding to the second sensing data;
if not, the information of the node corresponding to the second perception data is adjusted according to the correction data.
Optionally, the data correction model includes a long short-term memory (LSTM) recurrent neural network model.
Optionally, generating navigation information according to the adjusted factor graph and each second sensing data specifically includes:
determining the length of a pre-generated sliding window;
according to the length of the sliding window, adding a node corresponding to the sliding window into the factor graph;
determining a target node in the factor graph, which is positioned in the sliding window, according to the node corresponding to the sliding window in the factor graph;
and generating navigation information according to the information of the target node.
Optionally, the method further comprises:
determining an endpoint of the multi-source unmanned aerial vehicle;
planning a global flight path according to the terminal point and the current position of the multi-source unmanned aerial vehicle;
and planning a local flight track according to the flight path and the navigation information so as to control the multi-source unmanned aerial vehicle according to the local flight track.
Optionally, training the data correction model specifically includes:
acquiring historical sensing data acquired by sensing equipment as sample sensing data;
determining a true value corresponding to the sample perception data as a label;
inputting the sample perception data into the data correction model to determine correction data corresponding to the sample perception data output by the data correction model;
And training the data correction model according to the correction data and the label.
The present specification provides a multi-source unmanned aerial vehicle navigation device, comprising:
the factor graph fitting module is used for fitting a factor graph taking each type of sensing data as a node in advance according to the first sensing data of different types acquired by the sensing equipment and a navigation algorithm used for generating navigation information according to the first sensing data;
the second perception data determining module is used for determining each piece of second perception data acquired by the current perception equipment when navigation information is generated according to the factor graph;
the judging module is used for judging whether the sensing equipment for collecting the second sensing data of each type fails or not according to the second sensing data of the type;
and the adjusting module is used for adjusting the information of the node corresponding to the second sensing data in the factor graph if so, so as to generate navigation information according to the adjusted factor graph and each piece of second sensing data.
Optionally, the judging module is specifically configured to determine a fault threshold corresponding to the second sensing data of the type; judging whether the second perception data of the type is larger than the fault threshold value or not; if yes, the sensing equipment of the second sensing data of the type fails; if not, the sensing device of the second sensing data of the type does not fail.
Optionally, the apparatus further comprises:
the node matching module is used for judging whether nodes matched with the second sensing data exist in the factor graph or not according to each type of the second sensing data before navigation information is generated according to the adjusted factor graph and each piece of the second sensing data; if not, the second perception data is added to the factor graph as a node of the factor graph.
Optionally, the adjustment module is specifically configured to input the second sensing data into a pre-trained data correction model, and determine correction data of an output of the data correction model; determining a difference value between the second sensing data and the correction data as error information; judging whether the data value of the error information is larger than the error threshold value or not according to the error information and a predetermined error threshold value; if yes, disabling the node corresponding to the second sensing data; if not, the information of the node corresponding to the second perception data is adjusted according to the correction data.
Optionally, the data correction model includes a long short-term memory (LSTM) recurrent neural network model.
Optionally, the adjustment module is specifically configured to determine a length of a sliding window that is generated in advance; according to the length of the sliding window, adding a node corresponding to the sliding window into the factor graph; determining a target node in the factor graph, which is positioned in the sliding window, according to the node corresponding to the sliding window in the factor graph; and generating navigation information according to the information of the target node.
Optionally, the apparatus further comprises:
the control module is used for determining the end point of the multi-source unmanned aerial vehicle; planning a global flight path according to the terminal point and the current position of the multi-source unmanned aerial vehicle; and planning a local flight track according to the flight path and the navigation information so as to control the multi-source unmanned aerial vehicle according to the local flight track.
Optionally, the apparatus further comprises:
the training module is used for acquiring historical perception data acquired by the perception equipment and taking the historical perception data as sample perception data; determining a true value corresponding to the sample perception data as a label; inputting the sample perception data into the data correction model to determine correction data corresponding to the sample perception data output by the data correction model; and training the data correction model according to the correction data and the label.
The present specification provides a computer readable storage medium storing a computer program which when executed by a processor implements the above-described multi-source unmanned aerial vehicle navigation method.
The present specification provides an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the above-described multi-source unmanned aerial vehicle navigation method when executing the program.
At least one of the technical solutions adopted in this specification can achieve the following beneficial effects:
according to the multi-source unmanned aerial vehicle navigation method provided by the specification, first perception data of different types collected by the perception device and a navigation algorithm for generating navigation information according to the first perception data are pre-fitted, each type of perception data is used as a factor graph of a node, when the navigation information is generated according to the factor graph, each second perception data collected by the perception device at present is determined, whether the corresponding perception device fails or not is judged according to each second perception data, if yes, information of the node corresponding to the second perception data in the factor graph is adjusted, and the navigation information is generated according to the adjusted factor graph and each second perception data.
By exploiting the plug-and-play property of the factor graph, the method uses the different types of sensing data collected by different types of sensing devices as nodes of the factor graph. If a sensing device currently fails, the node corresponding to the data it collects can be adjusted according to this plug-and-play property, and navigation information is then generated from the adjusted factor graph and each piece of sensing data. This improves the reliability and stability of the navigation system and keeps the navigation information it outputs accurate.
Drawings
The accompanying drawings, which are included to provide a further understanding of the specification, illustrate and explain the exemplary embodiments of the present specification and their description, are not intended to limit the specification unduly. In the drawings:
fig. 1 is a schematic flow chart of a multi-source unmanned aerial vehicle navigation method provided in the present specification;
FIG. 2 is a schematic diagram of a factor graph provided herein;
FIG. 3 is a schematic diagram of a piecewise linear path provided herein;
fig. 4 is a schematic diagram of a structure of a multi-source unmanned aerial vehicle navigation device provided in the present specification;
fig. 5 is a schematic structural diagram of an electronic device corresponding to fig. 1 provided in the present specification.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the present specification more apparent, the technical solutions of the present specification will be clearly and completely described below with reference to specific embodiments of the present specification and corresponding drawings. It will be apparent that the described embodiments are only some, but not all, of the embodiments of the present specification. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are intended to be within the scope of the present disclosure.
The following describes in detail the technical solutions provided by the embodiments of the present specification with reference to the accompanying drawings.
Fig. 1 is a schematic flow chart of a multi-source unmanned aerial vehicle navigation method provided in the present specification, including the following steps:
s100: and fitting a factor graph taking each type of sensing data as a node according to the first sensing data of different types acquired by the sensing equipment and a navigation algorithm for generating navigation information according to the first sensing data.
In one or more embodiments of this specification, the multi-source unmanned aerial vehicle includes sensing devices for collecting different types of sensing data, where "multi-source" means multiple different types of sensing data; each type of sensing data may be collected by one or more sensing devices, which is not limited in this specification. A sensing device may be any of various sensors, or any other device capable of collecting sensing information, which is likewise not limited here. The execution subject of the method may be a controller that navigates the multi-source unmanned aerial vehicle using the sensing data collected by the sensing devices, or any other electronic device capable of navigating it. For convenience of explanation, the multi-source unmanned aerial vehicle navigation method provided in this specification is described below with the controller as the execution subject.
A multi-source unmanned aerial vehicle is generally equipped with sensing devices such as a Global Navigation Satellite System (GNSS) receiver, a vision sensor, and an Inertial Measurement Unit (IMU), and is navigated using the sensing data collected by each device. However, the vehicle may pass through regions where satellite signals are weak or denied, and its sensing devices may fail, so the accuracy of the navigation information generated from the sensing devices may drop, causing the vehicle to deviate from its flight path, have an accident, and so on. Accordingly, this specification provides a multi-source unmanned aerial vehicle navigation method.
The controller fits a factor graph taking each type of sensing data as a node in advance according to the first sensing data of different types collected by the sensing device and a navigation algorithm for generating navigation information according to the first sensing data.
Fig. 2 is a schematic diagram of a factor graph provided in this specification. As shown in Fig. 2, the nodes corresponding to the first sensing data may include multi-source unmanned aerial vehicle state nodes, visual feature state nodes, yaw angle deviation nodes, IMU factors, visual factors, pseudorange factors, sliding window length factors, and so on. A multi-source unmanned aerial vehicle state node contains the state quantities to be estimated, such as the position, attitude, and speed of the vehicle, while a visual feature state node is a spatial description of a visual feature and represents environmental information contained in the map. Each type of sensing data is used as a node because the factor graph has a plug-and-play property: if the sensing device corresponding to a node fails, that node can be deactivated from the factor graph structure to preserve the accuracy of the navigation information generated from the remaining sensing data. That is, since the factor graph is formed by factoring a joint probability function of multiple variables into a product of several local functions, a node can be disabled when its sensing device fails. For example, let $x_i$ be the $i$-th node of the factor graph (such as a visual feature state node, i.e., a node corresponding to sensing data) and let $y$ be the joint probability function, i.e., the navigation information, with $y = x_1 x_2 x_3 x_4$. If the sensing device corresponding to $x_1$ is detected to have failed, that node can be deactivated, so the original expression becomes $y = x_2 x_3 x_4$.
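The plug-and-play behaviour described above can be illustrated with a minimal Python sketch; the class and method names such as `FactorGraph` and `deactivate` are assumptions for illustration, not an API defined by the patent.

```python
# Minimal sketch of the plug-and-play factor graph idea described above.
# All class and method names are illustrative assumptions, not the patent's API.

class FactorGraph:
    def __init__(self):
        # node name -> factor value (e.g. a likelihood contributed by one sensor type)
        self.nodes = {}
        self.active = set()

    def add_node(self, name, factor_value):
        self.nodes[name] = factor_value
        self.active.add(name)

    def deactivate(self, name):
        # "Unplug" a node whose sensing device is judged to be faulty.
        self.active.discard(name)

    def joint(self):
        # Joint function y as the product of the active local factors,
        # e.g. y = x1*x2*x3*x4, or y = x2*x3*x4 after x1 is deactivated.
        y = 1.0
        for name in self.active:
            y *= self.nodes[name]
        return y

graph = FactorGraph()
graph.add_node("imu", 0.9)
graph.add_node("vision", 0.8)
graph.add_node("gnss_pseudorange", 0.95)
graph.add_node("sliding_window", 1.0)
print(graph.joint())                  # product of all four factors

graph.deactivate("gnss_pseudorange")  # GNSS device judged faulty
print(graph.joint())                  # product of the remaining three factors
```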
S102: and when navigation information is generated according to the factor graph, determining each piece of second perception data acquired by the current perception equipment.
In one or more embodiments of the present disclosure, the first sensing data and the second sensing data are both sensing data collected by the sensing device, and the second sensing data is different from the first sensing data in time for determining the sensing data, that is, before fitting the factor graph, the first sensing data is determined, and after the factor graph is successfully fitted, the second sensing data is determined when generating navigation information according to the factor graph.
S104: for each type of second sensing data, judging whether sensing equipment for collecting the type of second sensing data fails according to the type of second sensing data, if so, executing the step S106, otherwise, executing the step S110.
Specifically, the controller may input the second sensing data of the type into a pre-trained fault judgment model and judge, from the model's output, whether the sensing device collecting that type of second sensing data has failed. Alternatively, the controller may judge whether the device has failed according to a preset fault judgment rule and the values of that type of second sensing data; this specification does not limit how the judgment is made.
S106: and adjusting the information of the node corresponding to the second perception data in the factor graph.
Specifically, the controller inputs the second sensing data into a pre-trained data correction model and determines the correction data output by the model; the data correction model may be a long short-term memory (LSTM) recurrent neural network model. The controller then determines the difference between the second sensing data and the correction data as error information and, according to a predetermined error threshold, judges whether the error information is greater than that threshold. If so, the node corresponding to the second sensing data is deactivated, i.e., isolated: when navigation information is later generated from the nodes of the factor graph and each piece of second sensing data, the information of that node is not used (the information of a node includes the value of the second sensing data corresponding to it). If not, the information of the node corresponding to the second sensing data is adjusted according to the correction data; for example, the correction data may be used in place of the second sensing data.
For example, suppose the predetermined error threshold is 1 meter, the second sensing data indicates the multi-source unmanned aerial vehicle is 50 meters above the ground, and the correction data indicates 60 meters, so the error information is 10 meters. Since the error information is greater than the error threshold, the node corresponding to the second sensing data is disabled (alternatively, the node may be deleted, which is not limited in this specification). If the predetermined error threshold were instead 11 meters, the corrected value of 60 meters above the ground would be used as the second sensing data.
It should be noted that the controller directly disables the node corresponding to the second sensing data rather than adjusting it with the correction data because, when the error information exceeds the error threshold, the accuracy of the correction data itself is low, and navigation information generated from it would not be accurate either.
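Continuing the factor-graph sketch above, the adjustment logic of step S106 can be outlined as follows; `correction_model`, `graph`, and the method names are illustrative assumptions rather than the patent's implementation.

```python
# Sketch of step S106: correct, compare against the error threshold, then
# either deactivate the node or replace its data with the corrected value.
# `correction_model` and `graph` are assumed interfaces, not the patent's code.

def adjust_node(graph, node_name, second_data, correction_model, error_threshold):
    correction = correction_model(second_data)   # corrected sensing data
    error = abs(second_data - correction)        # error information

    if error > error_threshold:
        # Correction itself is unreliable: isolate the node instead of using it.
        graph.deactivate(node_name)
    else:
        # Use the corrected value as the node's sensing data.
        graph.nodes[node_name] = correction

# Example from the text: threshold 1 m, measured height 50 m, corrected height 60 m
# -> error 10 m > 1 m, so the node is deactivated.
```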
S108: and generating navigation information according to the adjusted factor graph and each second perception data.
Specifically, the controller takes all nodes in the adjusted factor graph as target nodes and generates navigation information from the target nodes and the second sensing data: the information corresponding to each target node is multiplied together to obtain the joint probability, i.e., the probability of state quantities such as the attitude, position, and speed of the multi-source unmanned aerial vehicle at the next moment.
As the flight path of the multi-source unmanned aerial vehicle accumulates and the flight environment changes, the number of nodes in the factor graph gradually grows, and sensing data corresponding to nodes the vehicle no longer needs may be present. To improve the efficiency of generating navigation information, the controller determines the length of a pre-generated sliding window and checks whether the factor graph contains a node corresponding to that sliding window. If such a node exists and its information equals the window length, the controller determines the target nodes in the factor graph that lie within the sliding window according to that node and generates navigation information from the information of the target nodes. If the node's information differs from the window length, its information is first corrected to equal the length. If no such node exists, a node corresponding to the sliding window is added to the factor graph, and the subsequent steps are then carried out to generate navigation information from the information of the target nodes.
For example, suppose the length of the pre-generated sliding window is 5. The controller determines whether the factor graph contains a node corresponding to the sliding window; if it does, the controller checks whether that node's information is 5. If it is, the 5 required nodes are selected from the factor graph according to the window length of 5; if not, the node's information is first corrected to 5 and the 5 required nodes are then selected. If no node corresponding to the sliding window exists, such a node is added to the factor graph before the subsequent steps are carried out.
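A minimal sketch of the window-based selection of target nodes, assuming the state nodes are kept in acquisition order; the list layout is an illustrative assumption.

```python
# Sketch of sliding-window node selection: keep only the most recent
# `window_length` state nodes as target nodes. Older nodes fall outside the
# window (they may be marginalized, as described later in this specification).

def select_target_nodes(time_ordered_nodes, window_length=5):
    # time_ordered_nodes: state nodes in acquisition order, oldest first
    if window_length >= len(time_ordered_nodes):
        return list(time_ordered_nodes)
    return list(time_ordered_nodes[-window_length:])

targets = select_target_nodes(["x1", "x2", "x3", "x4", "x5", "x6", "x7"], 5)
print(targets)  # ['x3', 'x4', 'x5', 'x6', 'x7']
```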
S110: and generating navigation information according to the factor graph and each second perception data.
Based on the multi-source unmanned aerial vehicle navigation method shown in Fig. 1, the method uses the plug-and-play property of the factor graph and takes the different types of sensing data collected by different types of sensing devices as nodes of the factor graph. If a sensing device currently fails, the node corresponding to the data it collects can be adjusted according to this plug-and-play property, and navigation information is then generated from the adjusted factor graph and each piece of sensing data, which improves the reliability and stability of the navigation system and keeps its output navigation information accurate.
For the factor graph, if a multivariate global function $F(x_1,x_2,x_3,x_4,x_5)$ can be partially decomposed, it can be expressed as a product of correlation functions of several local variables, as shown in the following formula:

$$F(x_1,x_2,x_3,x_4,x_5)=f_A(x_1)\,f_B(x_2)\,f_C(x_1,x_2,x_3)\,f_D(x_3,x_4)\,f_E(x_3,x_5)$$

The joint probability distribution of such a multivariate global function can be described with a factor graph model. The global function $F(x_1,x_2,x_3,x_4,x_5)$ is decomposed into a product of 5 local functions: the variables $x_1,x_2,x_3,x_4,x_5$ correspond to 5 variable nodes in the factor graph, and the 5 local functions $f_A,f_B,f_C,f_D,f_E$ correspond to 5 function nodes. The connections between variable nodes and function nodes are called edges, and a variable node is connected by an edge only to the function nodes of the local functions in which it appears.
In step S104, the controller may alternatively determine a fault threshold corresponding to the second sensing data of the type and judge whether the second sensing data of the type is greater than the fault threshold; if so, the sensing device for that type of second sensing data has failed, and if not, it has not failed. The fault threshold may be preset, which is not limited in this specification.
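As a minimal illustration of this threshold comparison; the threshold values, data keys, and dictionary layout below are placeholders, not values from the patent.

```python
# Sketch of the per-type fault threshold check (step S104, threshold variant).
# Threshold values and data-type keys are illustrative placeholders.

FAULT_THRESHOLDS = {
    "gnss_pseudorange_increment": 5.0,   # metres, placeholder
    "imu_acceleration_norm": 50.0,       # m/s^2, placeholder
}

def is_device_faulty(data_type, second_data_value):
    threshold = FAULT_THRESHOLDS[data_type]
    # Faulty if the type-specific test value exceeds its fault threshold.
    return second_data_value > threshold

print(is_device_faulty("gnss_pseudorange_increment", 7.2))  # True -> faulty
```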
The fault threshold may be an increment threshold, or a fault threshold corresponding to other second sensing data, which is not limited in this specification. Take an Inertial Navigation System (INS) and GNSS as the sensing devices and an increment threshold as the fault threshold. For the $i$-th satellite, a pseudorange increment between times $k$ and $j$ is defined for both GNSS and the INS, where the subscript $G$ denotes GNSS, $I$ denotes the INS, and $k$ and $j$ denote different moments. The INS determines the change in the pose of the multi-source unmanned aerial vehicle from time $k$ to time $j$ and from it predicts a pseudorange increment $\Delta\rho_I^i(k,j)$; the GNSS directly measures the pseudoranges $\rho_G^i(k)$ and $\rho_G^i(j)$, from which the measured pseudorange increment is

$$\Delta\rho_G^i(k,j)=\rho_G^i(k)-\rho_G^i(j).$$

The fault check then compares the two increments against the increment threshold $T$:

$$\left|\Delta\rho_G^i(k,j)-\Delta\rho_I^i(k,j)\right|>T.$$

That is, if the difference between the GNSS-measured pseudorange increment and the INS-predicted pseudorange increment is greater than the increment threshold, the corresponding sensing device is judged to have failed; otherwise it is judged to be operating normally.
For step S108, the types of the first sensing data determined in advance may not include every type of second sensing data; that is, some sensing devices may not have been operating when the factor graph was fitted, so the adjusted factor graph may lack nodes corresponding to the second sensing data collected by those devices, and the navigation information generated from it would be inaccurate. Therefore, before step S108, the controller judges, for each type of second sensing data, whether the factor graph contains a node matching that second sensing data, and if not, adds the second sensing data to the factor graph as a node.
After the controller generates the navigation information and uses it to control the multi-source unmanned aerial vehicle, it first determines the end point of the vehicle, plans a global flight path according to the end point and the vehicle's current position, and then plans a local flight trajectory according to the flight path and the navigation information, so as to control the multi-source unmanned aerial vehicle along the local flight trajectory.
That is, the controller adopts a coarse-to-fine two-layer planning framework: it first performs global route planning based on the end point of the multi-source unmanned aerial vehicle, and then plans local dynamic trajectories based on the current state of the vehicle and its surrounding environment, responding to changes in the environment at any time while keeping the planning real-time.
In global planning, the end point of the multi-source unmanned aerial vehicle is determined, and a Jump Point Search (JPS) algorithm is used to find the shortest piecewise linear path that satisfies the constraints. For each segment in the piecewise linear path output by the JPS algorithm, a convex decomposition algorithm is used to obtain a convex polygon that lies in the free traffic area, does not intersect the obstacle area, and completely encloses the current segment. Fig. 3 is a schematic diagram of the piecewise linear path provided in this specification; as shown in Fig. 3, each segment of the piecewise linear path is fitted as a cubic polynomial curve:
$$x_n(\tau)=a_n\tau^3+b_n\tau^2+c_n\tau+d_n,\qquad \tau\in[0,dt]$$
The polynomial is then solved in the form of a cubic Bézier curve, whose four control points can be expressed in terms of the polynomial coefficients.
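The patent gives the four control points in a figure that is not reproduced here. Under the assumption that each segment is the cubic polynomial $x_n(\tau)$ on $[0,dt]$ above, the standard power-basis-to-Bernstein conversion would give:

$$
\begin{aligned}
P_0 &= d_n,\\
P_1 &= d_n + \tfrac{dt}{3}\,c_n,\\
P_2 &= d_n + \tfrac{2\,dt}{3}\,c_n + \tfrac{dt^2}{3}\,b_n,\\
P_3 &= d_n + dt\,c_n + dt^2\,b_n + dt^3\,a_n,
\end{aligned}
\qquad
x_n(\tau)=\sum_{i=0}^{3} P_i \binom{3}{i}\Bigl(\tfrac{\tau}{dt}\Bigr)^{i}\Bigl(1-\tfrac{\tau}{dt}\Bigr)^{3-i},
$$

so that $P_3 = x_n(dt)$, i.e., the last control point coincides with the segment endpoint.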
the controller adopts a time-space sequential neural network model to carry out decision layer fusion on each sensing device, s i (k) I=1,..n represents the local decision result of the sensing device i at time k, s fi (k) And the judgment results of all the sensing devices are fused at the moment k, and T (k) is the joint judgment result until the moment k. From the network function point of view, network layer L 1 To L 2 The spatial domain fusion of the decision result of each sensing device is realized, and the connection weight reflects the contribution of each sensor to the spatial domain fusion. Network layer L 2 To L 3 Is a fully connected structure, and the input-output relationship is as follows:
network layer L 2 To L 3 The layer combines the result T (k-1) of the comprehensive decision before the k moment and the result s of the spatial fusion at the k moment fi (k) As evidence, according to the Dempster-Shafer (D-S) evidence theory synthesis rule, updating of the fusion result is completed, and time domain fusion is achieved.
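The D-S synthesis rule invoked here can be sketched as follows for two mass functions over a discrete frame of discernment; the dictionary representation and the example hypotheses are illustrative assumptions.

```python
# Sketch of the Dempster-Shafer (D-S) combination rule used for temporal fusion.
# Mass functions are dictionaries mapping frozenset hypotheses to belief mass.
from itertools import product

def ds_combine(m1, m2):
    combined = {}
    conflict = 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb          # mass falling on the empty set
    # Normalize by (1 - K), where K is the total conflicting mass.
    return {h: m / (1.0 - conflict) for h, m in combined.items()}

prior = {frozenset({"ok"}): 0.7, frozenset({"fault"}): 0.2, frozenset({"ok", "fault"}): 0.1}
spatial = {frozenset({"ok"}): 0.6, frozenset({"fault"}): 0.3, frozenset({"ok", "fault"}): 0.1}
print(ds_combine(prior, spatial))       # updated joint decision, analogous to T(k)
```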
For step S108, for the other nodes not selected by the sliding window, marginalization may be performed in order to retain their observation information, improve the adjustment accuracy, and reduce the amount of computation. For example, the controller may use the sum-product algorithm to compute the marginal probability functions of the joint probability. When the factor graph is used to solve a marginal probability function, the whole solving process can be viewed as message passing: each child node passes its message to an associated parent node, which in turn passes its own message to the next associated parent node. Different nodes process received messages in different ways during this propagation: a variable node multiplies all incoming messages together, while a factor node multiplies all incoming messages with its local function and then sums out the other variables. Finally, after the root node has received all passed messages, the computation of the marginal probability function is complete.
The message passed from a variable node to a function node is given by:

$$\mu_{x\to f}(x)=\prod_{h\in n(x)\setminus\{f\}}\mu_{h\to x}(x)$$

where $\mu_{x\to f}(x)$ is the message passed by variable node $x$ to function node $f$, $n(x)$ is the set of all function nodes $h$ associated with variable node $x$, and $\mu_{h\to x}(x)$ is the message passed by function node $h$ to variable node $x$.
The message passed from a function node to a variable node is given by:

$$\mu_{f\to x}(x)=\sum_{\sim\{x\}}\Bigl(f(X)\prod_{y\in n(f)\setminus\{x\}}\mu_{y\to f}(y)\Bigr)$$

where $\mu_{f\to x}(x)$ is the message passed by function node $f$ to variable node $x$, $n(f)$ is the set of all variable nodes $y$ associated with the function node, $\mu_{y\to f}(y)$ is the message passed by variable node $y$ to function node $f$, $X$ denotes the set of arguments of $f$, and $\sum_{\sim\{x\}}$ denotes summation over all variables except $x$.
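As an illustration only, the two message rules above can be sketched for discrete variables; the array layout and the restriction to pairwise factors are simplifying assumptions, not the patent's implementation.

```python
# Sketch of the sum-product message rules, restricted to pairwise factors
# over discrete variables for brevity.
import numpy as np

def msg_var_to_factor(incoming_factor_msgs, exclude_factor):
    # mu_{x->f}(x): product of messages from all neighbouring factors h != f.
    msg = np.ones_like(next(iter(incoming_factor_msgs.values())))
    for h, mu in incoming_factor_msgs.items():
        if h != exclude_factor:
            msg = msg * mu
    return msg

def msg_factor_to_var(factor_table, incoming_var_msg, to_first_variable=True):
    # mu_{f->x}(x): sum over the other variable of f(x, y) * mu_{y->f}(y).
    if to_first_variable:
        return factor_table @ incoming_var_msg      # sums over y
    return factor_table.T @ incoming_var_msg        # sums over x

# Tiny example: binary variables x, y with a pairwise factor f(x, y).
f_xy = np.array([[0.9, 0.1],
                 [0.2, 0.8]])
mu_y_to_f = np.array([0.5, 0.5])                    # message from y to f
print(msg_factor_to_var(f_xy, mu_y_to_f))           # unnormalized marginal info for x

incoming = {"f1": np.array([0.6, 0.4]), "f2": np.array([0.3, 0.7])}
print(msg_var_to_factor(incoming, exclude_factor="f3"))
```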
In addition, the present specification also provides a method of training a data correction model.
Specifically, when training the data correction model, the controller first acquires historical sensing data collected by the sensing devices as sample sensing data and determines the true value corresponding to the sample sensing data as its label. The sample sensing data is then input into the data correction model to determine the corresponding correction data output by the model. Finally, the data correction model is trained according to the correction data and the label: the difference between them is determined, and the model is trained with reducing this difference as the training objective.
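A minimal training sketch consistent with this description, assuming an LSTM regressor and an MSE loss in PyTorch; the framework choice, shapes, and hyperparameters are assumptions, not the patent's implementation.

```python
# Sketch of the data-correction-model training described above: an LSTM
# regressor trained to reduce the difference between its corrected output
# and the ground-truth labels.
import torch
from torch import nn

class CorrectionModel(nn.Module):
    def __init__(self, feature_dim, hidden_dim=64):
        super().__init__()
        self.lstm = nn.LSTM(feature_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, feature_dim)

    def forward(self, x):                      # x: (batch, time, feature_dim)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])           # corrected data for the last step

def train(model, samples, labels, epochs=10, lr=1e-3):
    # samples: historical sensing data sequences; labels: their true values.
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(samples), labels)  # difference between correction and label
        loss.backward()
        opt.step()                              # train to reduce the difference
    return model
```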
The foregoing is a method implemented by one or more embodiments of the present disclosure, and based on the same concept, the present disclosure further provides a corresponding multi-source unmanned aerial vehicle navigation device, as shown in fig. 4.
Fig. 4 is a schematic diagram of a multi-source unmanned aerial vehicle navigation device provided in the present specification, including:
the factor graph fitting module 400 is configured to fit a factor graph with each type of sensing data as a node according to each different type of first sensing data collected by the sensing device in advance and a navigation algorithm for generating navigation information according to each first sensing data;
a second sensing data determining module 402, configured to determine each piece of second sensing data currently acquired by the sensing device when generating navigation information according to the factor graph;
a judging module 404, configured to judge, for each type of second sensing data, whether a sensing device that collects the type of second sensing data has a fault according to the type of second sensing data;
and the adjusting module 406 is configured to adjust, if yes, information of a node corresponding to the second sensing data in the factor graph, so as to generate navigation information according to the adjusted factor graph and each second sensing data.
Optionally, the determining module 404 is specifically configured to determine a fault threshold corresponding to the second sensing data of the type; judging whether the second perception data of the type is larger than the fault threshold value or not; if yes, the sensing equipment of the second sensing data of the type fails; if not, the sensing device of the second sensing data of the type does not fail.
Optionally, the apparatus further comprises:
the node matching module 408 is configured to determine, for each type of second sensing data, whether a node matching the second sensing data exists in the factor graph before generating navigation information according to the adjusted factor graph and each second sensing data; if not, the second perception data is added to the factor graph as a node of the factor graph.
Optionally, the adjusting module 406 is specifically configured to input the second sensing data into a pre-trained data correction model, and determine correction data of an output of the data correction model; determining a difference value between the second sensing data and the correction data as error information; judging whether the data value of the error information is larger than the error threshold value or not according to the error information and a predetermined error threshold value; if yes, disabling the node corresponding to the second sensing data; if not, the information of the node corresponding to the second perception data is adjusted according to the correction data.
Optionally, the data correction model includes a long short-term memory (LSTM) recurrent neural network model.
Optionally, the adjusting module 406 is specifically configured to determine a length of the sliding window that is generated in advance; according to the length of the sliding window, adding a node corresponding to the sliding window into the factor graph; determining a target node in the factor graph, which is positioned in the sliding window, according to the node corresponding to the sliding window in the factor graph; and generating navigation information according to the information of the target node.
Optionally, the apparatus further comprises:
a control module 410, configured to determine an endpoint of the multi-source unmanned aerial vehicle; planning a global flight path according to the terminal point and the current position of the multi-source unmanned aerial vehicle; and planning a local flight track according to the flight path and the navigation information so as to control the multi-source unmanned aerial vehicle according to the local flight track.
Optionally, the apparatus further comprises:
the training module 412 is configured to obtain historical sensing data collected by the sensing device as sample sensing data; determining a true value corresponding to the sample perception data as a label; inputting the sample perception data into the data correction model to determine correction data corresponding to the sample perception data output by the data correction model; and training the data correction model according to the correction data and the label.
The present specification also provides a computer readable storage medium storing a computer program operable to perform a method of multi-source unmanned aerial vehicle navigation as provided in fig. 1 above.
The present specification also provides a schematic structural diagram of the electronic device shown in fig. 5, which corresponds to fig. 1. At the hardware level, as shown in fig. 5, the electronic device includes a processor, an internal bus, a network interface, a memory, and a nonvolatile storage, and may of course include hardware required by other services. The processor reads the corresponding computer program from the nonvolatile memory into the memory and then runs the computer program to realize the multi-source unmanned aerial vehicle navigation method shown in the figure 1.
Of course, other implementations, such as logic devices or combinations of hardware and software, are not excluded from the present description, that is, the execution subject of the following processing flows is not limited to each logic unit, but may be hardware or logic devices.
In the 1990s, improvements to a technology could clearly be distinguished as improvements in hardware (e.g., improvements to circuit structures such as diodes, transistors, switches, etc.) or software (improvements to the process flow). However, with the development of technology, many improvements of current method flows can be regarded as direct improvements of hardware circuit structures. Designers almost always obtain corresponding hardware circuit structures by programming improved method flows into hardware circuits. Therefore, it cannot be said that an improvement of a method flow cannot be realized by a hardware entity module. For example, a programmable logic device (Programmable Logic Device, PLD) (e.g., a field programmable gate array (Field Programmable Gate Array, FPGA)) is an integrated circuit whose logic function is determined by the programming of the device by a user. A designer programs to "integrate" a digital system onto a PLD without requiring the chip manufacturer to design and fabricate application-specific integrated circuit chips. Moreover, nowadays, instead of manually manufacturing integrated circuit chips, such programming is mostly implemented by using "logic compiler" software, which is similar to the software compiler used in program development and writing. The original code before compiling is also written in a specific programming language, called a hardware description language (Hardware Description Language, HDL), of which there is not just one kind but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM, RHDL (Ruby Hardware Description Language), etc.; VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are currently the most commonly used. It will also be apparent to those skilled in the art that a hardware circuit implementing the logic method flow can be readily obtained by merely slightly programming the method flow into an integrated circuit using several of the hardware description languages described above.
The controller may be implemented in any suitable manner, for example, the controller may take the form of, for example, a microprocessor or processor and a computer readable medium storing computer readable program code (e.g., software or firmware) executable by the (micro) processor, logic gates, switches, application specific integrated circuits (Application Specific Integrated Circuit, ASIC), programmable logic controllers, and embedded microcontrollers, examples of which include, but are not limited to, the following microcontrollers: ARC 625D, atmel AT91SAM, microchip PIC18F26K20, and Silicone Labs C8051F320, the memory controller may also be implemented as part of the control logic of the memory. Those skilled in the art will also appreciate that, in addition to implementing the controller in a pure computer readable program code, it is well possible to implement the same functionality by logically programming the method steps such that the controller is in the form of logic gates, switches, application specific integrated circuits, programmable logic controllers, embedded microcontrollers, etc. Such a controller may thus be regarded as a kind of hardware component, and means for performing various functions included therein may also be regarded as structures within the hardware component. Or even means for achieving the various functions may be regarded as either software modules implementing the methods or structures within hardware components.
The system, apparatus, module or unit set forth in the above embodiments may be implemented in particular by a computer chip or entity, or by a product having a certain function. One typical implementation is a computer. In particular, the computer may be, for example, a personal computer, a laptop computer, a cellular telephone, a camera phone, a smart phone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
For convenience of description, the above devices are described as being functionally divided into various units, respectively. Of course, the functions of each element may be implemented in one or more software and/or hardware elements when implemented in the present specification.
It will be appreciated by those skilled in the art that embodiments of the present description may be provided as a method, system, or computer program product. Accordingly, the present specification may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present description can take the form of a computer program product on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
The present description is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the specification. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, random Access Memory (RAM) and/or nonvolatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). Memory is an example of computer-readable media.
Computer readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of storage media for a computer include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium, which can be used to store information that can be accessed by a computing device. Computer-readable media, as defined herein, do not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article or apparatus that comprises the element.
It will be appreciated by those skilled in the art that embodiments of the present description may be provided as a method, system, or computer program product. Accordingly, the present specification may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present description can take the form of a computer program product on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
The description may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The specification may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
In this specification, the embodiments are described in a progressive manner; identical and similar parts of the embodiments may be referred to one another, and each embodiment focuses on its differences from the other embodiments. In particular, the system embodiments are described relatively briefly because they are substantially similar to the method embodiments, and reference may be made to the relevant parts of the description of the method embodiments.
The foregoing is merely exemplary of the present disclosure and is not intended to limit the disclosure. Various modifications and alterations to this specification will become apparent to those skilled in the art. Any modifications, equivalent substitutions, improvements, or the like, which are within the spirit and principles of the present description, are intended to be included within the scope of the claims of the present description.

Claims (10)

1. A multi-source unmanned aerial vehicle navigation method, wherein the multi-source unmanned aerial vehicle comprises sensing devices for collecting different types of sensing data, the method comprising:
fitting a factor graph that takes each type of sensing data as a node, according to different types of first sensing data acquired by the sensing devices and a navigation algorithm that generates navigation information from the first sensing data in advance;
when generating navigation information according to the factor graph, determining the second sensing data currently acquired by the sensing devices;
for each type of second sensing data, judging, according to the second sensing data of the type, whether the sensing device collecting the second sensing data of the type has failed;
if so, adjusting the information of the node corresponding to the second sensing data in the factor graph, and generating navigation information according to the adjusted factor graph and each piece of second sensing data.
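As an editorial illustration only (not part of the claims), the following minimal Python sketch shows the claim-1 flow of fitting a per-sensor-type factor graph, checking each type of second sensing data for a device failure, and adjusting the affected node. The class and function names, the dictionary-based graph, and the weight-based "adjustment" are simplifying assumptions, not the claimed implementation.

```python
# Minimal sketch of the claim-1 flow. Simplifying assumptions: each sensor
# type contributes one node, a node stores the latest measurement and a
# weight, and "adjusting the node" is reduced to down-weighting it.

class FactorGraph:
    def __init__(self):
        self.nodes = {}                          # sensor type -> node info

    def fit(self, first_sensing_data):
        # One node per type of sensing data, fitted from the first batch.
        for sensor_type, value in first_sensing_data.items():
            self.nodes[sensor_type] = {"value": value, "weight": 1.0}

    def adjust_node(self, sensor_type, new_weight):
        self.nodes[sensor_type]["weight"] = new_weight


def device_failed(sensor_type, value):
    # Placeholder fault test; claim 2 refines this into a per-type
    # threshold check (see the sketch after claim 2 below).
    return abs(value) > 100.0


def navigate_step(graph, second_sensing_data):
    for sensor_type, value in second_sensing_data.items():
        if device_failed(sensor_type, value):
            graph.adjust_node(sensor_type, new_weight=0.1)   # adjust node info
        else:
            graph.nodes[sensor_type]["value"] = value
    # "Navigation information" is reduced here to a weighted node summary.
    return {t: n["value"] * n["weight"] for t, n in graph.nodes.items()}


graph = FactorGraph()
graph.fit({"imu": 0.02, "gnss": 1.5})                      # first sensing data
print(navigate_step(graph, {"imu": 0.03, "gnss": 250.0}))  # second sensing data
```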
2. The method of claim 1, wherein judging, according to the second sensing data of the type, whether the sensing device collecting the second sensing data of the type has failed comprises:
determining a fault threshold corresponding to the second sensing data of the type;
judging whether the second sensing data of the type is greater than the fault threshold;
if so, determining that the sensing device collecting the second sensing data of the type has failed;
if not, determining that the sensing device collecting the second sensing data of the type has not failed.
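The threshold test of claim 2 could be read, purely as an illustrative sketch, along the following lines; the specific sensor types and threshold values are invented for the example and are not taken from the claim.

```python
# Hypothetical per-type fault thresholds; real values would come from sensor
# specifications or calibration, which the claim does not fix.
FAULT_THRESHOLDS = {"gnss_position_jump_m": 100.0, "imu_acceleration_mps2": 50.0}

def sensing_device_failed(sensor_type, reading):
    # The device is treated as failed iff the reading of this type of
    # second sensing data exceeds its fault threshold.
    return abs(reading) > FAULT_THRESHOLDS[sensor_type]

print(sensing_device_failed("gnss_position_jump_m", 250.0))   # True  -> failed
print(sensing_device_failed("imu_acceleration_mps2", 9.8))    # False -> healthy
```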
3. The method of claim 1, wherein before generating navigation information according to the adjusted factor graph and each piece of second sensing data, the method further comprises:
judging, for each type of second sensing data, whether a node matching the second sensing data exists in the factor graph;
if not, adding the second sensing data to the factor graph as a node of the factor graph.
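A minimal sketch of the claim-3 check, with a hypothetical dictionary keyed by sensor type standing in for the factor graph:

```python
# If no node matches a type of second sensing data, add that data as a new
# node before navigation information is generated.
def ensure_node(graph_nodes, sensor_type, value):
    if sensor_type not in graph_nodes:           # no matching node in the graph
        graph_nodes[sensor_type] = {"value": value, "weight": 1.0}
    return graph_nodes

nodes = {"imu": {"value": 0.02, "weight": 1.0}}
print(ensure_node(nodes, "barometer", 101.3))    # adds a node for the new type
```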
4. The method of claim 1, wherein adjusting the information of the node corresponding to the second sensing data in the factor graph specifically comprises:
inputting the second sensing data into a pre-trained data correction model, and determining the correction data output by the data correction model;
determining the difference between the second sensing data and the correction data as error information;
judging, according to the error information and a predetermined error threshold, whether the value of the error information is greater than the error threshold;
if so, disabling the node corresponding to the second sensing data;
if not, adjusting the information of the node corresponding to the second sensing data according to the correction data.
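An illustrative reading of the claim-4 adjustment rule, in which `correct()` is a trivial placeholder for the pre-trained data correction model and the error threshold is an assumed value:

```python
# Sketch of the claim-4 rule: compare the measurement against its corrected
# value; disable the node on a large error, otherwise adjust it.
ERROR_THRESHOLD = 5.0   # assumed; the claim only requires a predetermined threshold

def correct(measurement):
    return 0.9 * measurement                      # placeholder correction model

def adjust_node(node, measurement):
    corrected = correct(measurement)
    error = abs(measurement - corrected)          # error information
    if error > ERROR_THRESHOLD:
        node["enabled"] = False                   # disable the node
    else:
        node["value"] = corrected                 # adjust node info with corrected data
    return node

print(adjust_node({"value": 0.0, "enabled": True}, measurement=2.0))    # adjusted
print(adjust_node({"value": 0.0, "enabled": True}, measurement=80.0))   # disabled
```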
5. The method of claim 4, wherein the data correction model comprises a long short-term memory (LSTM) recurrent neural network model.
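For illustration, a minimal PyTorch sketch of an LSTM-based corrector in the spirit of claim 5; the layer sizes, window length, and feature dimension are arbitrary assumptions rather than values taken from the application.

```python
import torch
import torch.nn as nn

class LstmCorrector(nn.Module):
    """Hypothetical LSTM-based data correction model: maps a short window of
    raw sensor readings to a corrected reading for the latest time step."""
    def __init__(self, n_features=3, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_features, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_features)

    def forward(self, x):                  # x: (batch, time, n_features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])    # corrected reading for the last step

model = LstmCorrector()
window = torch.randn(1, 10, 3)             # e.g. the last 10 raw samples (made-up shape)
print(model(window).shape)                 # torch.Size([1, 3])
```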
6. The method of claim 1, wherein generating navigation information according to the adjusted factor graph and each piece of second sensing data comprises:
determining the length of a pre-generated sliding window;
adding a node corresponding to the sliding window into the factor graph according to the length of the sliding window;
determining, according to the node corresponding to the sliding window in the factor graph, the target nodes in the factor graph that fall within the sliding window;
and generating navigation information according to the information of the target nodes.
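A simplified sketch of the sliding-window selection in claim 6, using a plain list of toy state nodes; a real factor-graph back end would marginalize or drop out-of-window nodes rather than merely slicing a list.

```python
# Only the most recent `window_len` state nodes are kept as optimization
# targets; older nodes fall outside the sliding window.
from collections import deque

def target_nodes(all_nodes, window_len):
    """Return the nodes that lie inside the fixed-length sliding window."""
    return list(deque(all_nodes, maxlen=window_len))

nodes = [{"t": k, "pose": (float(k), 0.0)} for k in range(8)]   # toy state nodes
window = target_nodes(nodes, window_len=3)
# Navigation information derived only from the in-window nodes,
# e.g. the latest pose estimate:
print(window[-1]["pose"])
```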
7. The method of claim 1, wherein the method further comprises:
determining the end point of the multi-source unmanned aerial vehicle;
planning a global flight path according to the end point and the current position of the multi-source unmanned aerial vehicle;
and planning a local flight trajectory according to the flight path and the navigation information, so as to control the multi-source unmanned aerial vehicle according to the local flight trajectory.
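Purely as a sketch of the claim-7 split between a global flight path and a local flight trajectory (the straight-line waypoint planner and the two-waypoint horizon are assumptions; the claim does not fix any particular planning algorithm):

```python
import math

def plan_global_path(start, goal, step=50.0):
    """Toy 'global path': straight-line waypoints from the current position
    to the end point, spaced `step` metres apart."""
    dx, dy = goal[0] - start[0], goal[1] - start[1]
    dist = math.hypot(dx, dy)
    n = max(1, int(dist // step))
    return [(start[0] + dx * k / n, start[1] + dy * k / n) for k in range(1, n + 1)]

def plan_local_trajectory(position, path, horizon=2):
    """Toy 'local trajectory': the next few waypoints of the global path,
    to be tracked from the current navigation estimate."""
    return path[:horizon]

path = plan_global_path(start=(0.0, 0.0), goal=(200.0, 150.0))
print(plan_local_trajectory(position=(0.0, 0.0), path=path))
```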
8. The method of claim 1, wherein training the data correction model comprises:
acquiring historical sensing data collected by the sensing devices as sample sensing data;
determining the true value corresponding to the sample sensing data as a label;
inputting the sample sensing data into the data correction model to determine the correction data, output by the data correction model, corresponding to the sample sensing data;
and training the data correction model according to the correction data and the label.
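A minimal PyTorch training loop in the spirit of claim 8, with randomly generated stand-ins for the historical sample data and their ground-truth labels; a single linear layer replaces the claim-5 LSTM purely to keep the example short.

```python
import torch
import torch.nn as nn

# Toy stand-in for the data correction model; the LSTM sketched for claim 5
# would slot in here unchanged.
model = nn.Linear(3, 3)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Historical sensing data as samples, with true values as labels
# (both randomly generated here purely so the loop runs).
samples = torch.randn(256, 3)
labels = samples + 0.05 * torch.randn(256, 3)

for epoch in range(5):
    corrected = model(samples)            # correction data for the samples
    loss = loss_fn(corrected, labels)     # train against the labels
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.4f}")
```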
9. A computer-readable storage medium, characterized in that the storage medium stores a computer program which, when executed by a processor, implements the method of any of the preceding claims 1-8.
10. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the method of any of the preceding claims 1-8 when executing the program.
CN202310595059.9A 2023-05-23 2023-05-23 Multi-source unmanned aerial vehicle navigation method and device, storage medium and electronic equipment Pending CN116625372A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310595059.9A CN116625372A (en) 2023-05-23 2023-05-23 Multi-source unmanned aerial vehicle navigation method and device, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310595059.9A CN116625372A (en) 2023-05-23 2023-05-23 Multi-source unmanned aerial vehicle navigation method and device, storage medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN116625372A true CN116625372A (en) 2023-08-22

Family

ID=87637742

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310595059.9A Pending CN116625372A (en) 2023-05-23 2023-05-23 Multi-source unmanned aerial vehicle navigation method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN116625372A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination