WO2021018113A1 - Online modeling method for dynamic mutual observation of drone swarm collaborative navigation - Google Patents

Online modeling method for dynamic mutual observation of drone swarm collaborative navigation

Info

Publication number
WO2021018113A1
Authority
WO
WIPO (PCT)
Prior art keywords
available reference
mutual observation
observation
members
mutual
Prior art date
Application number
PCT/CN2020/105037
Other languages
English (en)
French (fr)
Inventor
王融
熊智
刘建业
李荣冰
李传意
杜君南
陈欣
赵耀
崔雨辰
安竞轲
聂庭宇
Original Assignee
南京航空航天大学
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 南京航空航天大学 filed Critical 南京航空航天大学
Priority to US17/274,445 priority Critical patent/US20210255645A1/en
Publication of WO2021018113A1 publication Critical patent/WO2021018113A1/zh

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D1/104Simultaneous control of position or course in three dimensions specially adapted for aircraft involving a plurality of aircrafts, e.g. formation flying
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G5/00Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0004Transmission of traffic-related information to or from an aircraft
    • G08G5/0008Transmission of traffic-related information to or from an aircraft with other aircraft
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G5/00Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0017Arrangements for implementing traffic-related aircraft activities, e.g. arrangements for generating, displaying, acquiring or managing traffic information
    • G08G5/0021Arrangements for implementing traffic-related aircraft activities, e.g. arrangements for generating, displaying, acquiring or managing traffic information located in the aircraft
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G5/00Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/003Flight plan management
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G5/00Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0047Navigation or guidance aids for a single aircraft
    • G08G5/0052Navigation or guidance aids for a single aircraft for cruising
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G5/00Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0047Navigation or guidance aids for a single aircraft
    • G08G5/0069Navigation or guidance aids for a single aircraft specially adapted for an unmanned aircraft
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00UAVs characterised by their flight controls
    • B64U2201/10UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]
    • B64U2201/102UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS] adapted for flying in formations

Definitions

  • The traditional integrated navigation system model is mainly based on measurement information in a fixed reference coordinate frame with fixed performance, whereas the relative positions and positioning performance of drone swarm members change continuously during flight.
  • Each member's role in swarm collaborative navigation, whether assisted object node or assisting reference node, also changes continuously.
  • The traditional integrated navigation model therefore cannot meet the needs of drone swarm collaborative navigation.
  • Step 1 Number each member in the drone swarm and denote it as 1, 2, ..., n.
  • Perform a first-level screening to determine the role of each member in collaborative navigation: members whose number of usable satellites is less than 4 are object members, and the set of object member numbers is denoted A; members whose number of usable satellites is not less than 4 are candidate reference members, and the set of candidate reference member numbers is denoted B; and
  • Step 3: Obtain the position indicated by the airborne navigation system of candidate reference member j and its positioning error covariance, and transform both into the local East-North-Up (ENU) geographic frame of object member i established in step 2, where j is the member number and j ∈ B;
  • Step 4: According to whether each object member and each candidate reference member can range each other, perform a second-level screening of the candidate reference members to determine each candidate reference member's role in collaborative navigation: candidate reference members that can mutually range with object member i are the available reference members of object member i, and the set of their numbers is denoted C i; and
  • Step 5 Calculate the mutual observation vector of the object member and its available reference member, and calculate the vector projection matrix of the object member and its available reference member according to the mutual observation vector;
  • Step 6 Calculate the object position projection matrix of the object member and its available reference members and the available reference position projection matrix
  • Step 7 using the vector projection matrix obtained in step 5 and the object position projection matrix obtained in step 6, to calculate the state mutual observation matrix between the object member and its available reference members;
  • Step 8: Use the vector projection matrix obtained in step 5 and the available-reference position projection matrix obtained in step 6 to calculate the noise mutual observation matrix between the object member and its available reference member; use the noise mutual observation matrix to calculate the mutual observation noise covariance between the object member and its available reference member;
  • Step 9 Use the state mutual observation matrix obtained in Step 7 to establish a mutual observation set matrix of all available reference members of the object member;
  • Step 10 Use the mutual observation noise covariance obtained in Step 8 to establish the mutual observation set covariance of the object member for all available reference members;
  • Step 11 Using the mutual observation vector obtained in Step 5, establish the mutual observation set observations of all available reference members of the object member;
  • Step 12: From the mutual observation set matrix obtained in step 9, the mutual observation set covariance obtained in step 10, and the mutual observation set measurement obtained in step 11, establish the dynamic mutual observation model for drone swarm collaborative navigation; based on this model, perform weighted least-squares positioning of the object member, obtain the longitude, latitude, and altitude corrections of the object member's position, and calculate the corrected longitude, latitude, and altitude;
  • Step 13 using the state mutual observation matrix obtained in step 7 and the mutual observation noise covariance obtained in step 8 to calculate the object member position estimation covariance;
  • Step 14: Use the object position projection matrix obtained in step 6 and the longitude, latitude, and altitude corrections obtained in step 12 to calculate the online modeling error; when the online modeling error is smaller than the preset error-control threshold for dynamic mutual observation online modeling, the modeling iteration is judged to have converged, online modeling ends, and the process goes to step 15; otherwise, return to step 5 to iteratively refine the mutual observation model;
  • Step 15 Judge whether the navigation is over, and if so, end; otherwise, return to step 1 to perform modeling at the next moment.
  • the mutual observation vector in step 5 is expressed as: (formula image not reproduced);
  • Δλ ik, ΔL ik, and Δh ik are the differences between the longitude, latitude, and altitude output by the airborne navigation systems of object member i and its available reference member k; the vector's components lie along the east, north, and up axes of object member i's local ENU geographic frame; R N is the prime-vertical radius of curvature of the Earth reference ellipsoid;
  • f is the flattening of the Earth reference ellipsoid;
  • L i and h i are the latitude and altitude output by the airborne navigation system of object member i.
  • the object position projection matrix in step 6 is expressed as: (formula image not reproduced);
  • in the object position projection matrix of object member i and its available reference member k, Δλ ik and ΔL ik are the differences between the longitude and latitude output by the airborne navigation systems of members i and k, L i and h i are the latitude and altitude output by object member i's airborne navigation system, and R N is the prime-vertical radius of curvature of the Earth reference ellipsoid.
  • the available-reference position projection matrix in step 6 is expressed as: (formula image not reproduced);
  • L i and h i are the latitude and altitude output by object member i's airborne navigation system, and R N is the prime-vertical radius of curvature of the Earth reference ellipsoid.
  • the state mutual observation matrix in step 7 is expressed as: (formula image not reproduced);
  • the noise mutual observation matrix in step 8 is expressed as: (formula image not reproduced);
  • the projection matrix used there is the available-reference position projection matrix of object member i and its available reference member k.
  • the online modeling error in step 14 is expressed as: (formula image not reproduced).
  • the present invention adopts the above technical solutions and has the following technical effects:
  • the present invention considers the difference in positioning performance between reference members, and improves the modeling accuracy by integrating the positioning error of the reference member and the measurement error of the ranging sensor and introducing weighted iteration.
  • the present invention is highly flexible and adapts to the mutual observation conditions under different mutual positional relationships and distributions among drone swarms and members of different sizes.
  • Fig. 1 is a flowchart of a dynamic mutual observation online modeling method for drone swarm cooperative navigation of the present invention.
  • Fig. 2 is a graph of iterative modeling of the object member moving coordinate system constructed by the method of the present invention.
  • Fig. 3 is a graph of position error in iterative modeling using the method of the present invention.
  • Fig. 4 is a curve diagram of longitude, latitude, and height errors for iterative modeling using the method of the present invention.
  • the present invention provides a dynamic mutual observation online modeling method for drone swarm collaborative navigation, provides effective support for drone swarm collaborative navigation, and improves the flexibility and accuracy of collaborative navigation modeling.
  • The scheme, shown in Figure 1, includes the following steps:
  • First-level screening of the members determines each member's role in collaborative navigation: members receiving fewer than 4 usable satellites are object members, and the set of object member numbers is denoted A; members receiving no fewer than 4 usable satellites are candidate reference members, and the set of candidate reference member numbers is denoted B; and
  • Step (3): Obtain the position indicated by the airborne navigation system of each object member classified in step (2) and, taking this indicated position as the origin, establish the member's local ENU geographic frame; denote the indicated position of object member i's airborne navigation system as (λ i, L i, h i) and the corresponding local ENU frame as O iXYZ, where λ is longitude, L is latitude, and h is altitude, with i the member number and i ∈ A.
  • Step (4): Obtain the position indicated by the airborne navigation system of each candidate reference member classified in step (2) and its positioning error covariance, and transform them into the object member's local ENU geographic frame established in step (3); denote the position indicated by the airborne navigation system of reference member j as (λ j, L j, h j), where j is the member number and j ∈ B.
  • Step (5): For each object member and each candidate reference member in turn, according to whether they can range each other, perform a second-level screening of the candidate reference members to determine their roles in collaborative navigation: candidate reference members that can mutually range with object member i are the available reference members of object member i, and the set of their numbers is denoted C i; and
  • Step (7): Use the mutual observation vector of the object member and its available reference member obtained in step (6) to calculate the vector projection matrix of object member i and its available reference member k;
  • d ik is the computed distance between object member i and its available reference member k;
  • the measured distance between object member i and its available reference member k is also used;
  • Step (16): Use the mutual observation set matrix of object member i over all its available reference members obtained in step (13), the mutual observation set covariance obtained in step (14), and the mutual observation set measurement obtained in step (15) to form the dynamic mutual observation model of drone swarm collaborative navigation, perform weighted least-squares positioning of the object member, and obtain the longitude, latitude, and altitude corrections of object member i's position.
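The two-level screening described in steps 1 and 4 can be sketched as follows. This is a minimal illustration in Python; the inputs `visible_sats` and `can_range` are hypothetical stand-ins for the satellite-receiver and ranging-sensor interfaces, which the patent does not specify:

```python
def screen_members(visible_sats, can_range):
    """Two-level screening of swarm members.

    visible_sats: dict mapping member number -> count of usable satellites.
    can_range: callable (i, j) -> bool, True if members i and j can
    mutually range each other.
    Returns (A, B, C): object members, candidate reference members, and
    each object member's available-reference set, per the patent's notation.
    """
    # First level: fewer than 4 usable satellites -> object member (set A);
    # otherwise -> candidate reference member (set B).
    A = {m for m, n in visible_sats.items() if n < 4}
    B = {m for m, n in visible_sats.items() if n >= 4}
    # Second level: candidates that can range with object member i form
    # its available-reference set C_i.
    C = {i: {j for j in B if can_range(i, j)} for i in A}
    return A, B, C
```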


Abstract

An online modeling method for dynamic mutual observation of drone swarm collaborative navigation: first, the members are screened at a first level according to the number of satellites visible to each member's satellite navigation receiver, clarifying each member's role in collaborative navigation at the current moment; a moving coordinate frame is then established with each to-be-assisted object member as its origin, and the coordinates of each candidate reference node are computed; on this basis, a second-level screening of the candidate reference nodes is performed according to whether relative ranging with each object member is possible, yielding the set of available reference members, and a dynamic mutual observation model is initially established; finally, the model is optimized by iterative correction, and a new round of dynamic mutual observation modeling is carried out as the swarm's observation relations, the members' own positioning performance, and their roles in collaborative navigation change, providing an accurate basis for effectively realizing drone swarm collaborative navigation.

Description

Online modeling method for dynamic mutual observation of drone swarm collaborative navigation

Technical field
The present invention relates to an online modeling method for dynamic mutual observation of drone swarm collaborative navigation, and belongs to the technical field of drone swarm collaborative navigation.

Background
The drone swarm is a concept proposed at home and abroad in recent years: an organizational mode in which multiple drones arrange themselves in three-dimensional space and allocate tasks to meet mission requirements. It covers the generation, keeping, and reconfiguration of flight formations as well as the organization of flight missions, and can be adjusted dynamically according to external conditions and mission needs.

Traditional integrated navigation system models are mainly based on measurement information in a fixed reference coordinate frame with fixed performance, whereas the relative positions and positioning performance of drone swarm members change continuously during flight, and each member's role in swarm collaborative navigation, whether assisted object node or assisting reference node, also changes continuously; traditional integrated navigation models therefore cannot meet the needs of drone swarm collaborative navigation.

Accordingly, studying a dynamic mutual observation model and modeling method that is based on a moving reference frame and accounts for the observation relations among members, their own positioning performance, and their changing roles in collaborative navigation will enable an adaptive model description of mutual observation information during collaborative navigation, supporting drone swarms in exploiting their autonomous cooperative advantages.

Summary of the invention
The technical problem to be solved by the present invention is to provide an online modeling method for dynamic mutual observation of drone swarm collaborative navigation that, in a moving reference frame, accounts for the observation relations among members, their own positioning performance, and their changing roles in collaborative navigation, establishes and optimizes a dynamic mutual observation model, and provides an accurate basis for effectively realizing collaborative navigation.

To solve the above technical problem, the present invention adopts the following technical solution:
An online modeling method for dynamic mutual observation of drone swarm collaborative navigation, comprising the following steps:
Step 1: number each member of the drone swarm and denote the members 1, 2, …, n; according to the number of usable satellites received by each member's onboard satellite navigation receiver at the current moment, perform a first-level screening of the members to determine each member's role in collaborative navigation: members receiving fewer than 4 usable satellites are object members, and the set of object member numbers is denoted A; members receiving no fewer than 4 usable satellites are candidate reference members, and the set of candidate reference member numbers is denoted B; and
(formula image not reproduced)
Step 2: obtain the position indicated by the airborne navigation system of object member i and, taking this indicated position as the origin, establish the member's local East-North-Up (ENU) geographic coordinate frame, where i is the member number and i ∈ A;
Step 3: obtain the position indicated by the airborne navigation system of candidate reference member j and its positioning error covariance, and transform both into the local ENU geographic frame of object member i established in step 2, where j is the member number and j ∈ B;
Step 4: according to whether each object member and each candidate reference member can range each other, perform a second-level screening of the candidate reference members to determine each candidate reference member's role in collaborative navigation: candidate reference members that can mutually range with object member i are the available reference members of object member i, and the set of their numbers is denoted C i; and
(formula image not reproduced)
Step 5: compute the mutual observation vectors of each object member and its available reference members, and from the mutual observation vectors compute their vector projection matrices;
Step 6: compute the object position projection matrices and the available-reference position projection matrices of each object member and its available reference members;
Step 7: using the vector projection matrix obtained in step 5 and the object position projection matrix obtained in step 6, compute the state mutual observation matrix between each object member and its available reference members;
Step 8: using the vector projection matrix obtained in step 5 and the available-reference position projection matrix obtained in step 6, compute the noise mutual observation matrix between each object member and its available reference members; using the noise mutual observation matrix, compute the mutual observation noise covariance between each object member and its available reference members;
Step 9: using the state mutual observation matrices obtained in step 7, build the mutual observation set matrix of each object member over all its available reference members;
Step 10: using the mutual observation noise covariances obtained in step 8, build the mutual observation set covariance of each object member over all its available reference members;
Step 11: using the mutual observation vectors obtained in step 5, build the mutual observation set measurement of each object member over all its available reference members;
Step 12: from the mutual observation set matrix obtained in step 9, the mutual observation set covariance obtained in step 10, and the mutual observation set measurement obtained in step 11, establish the dynamic mutual observation model for drone swarm collaborative navigation; based on the model, perform weighted least-squares positioning of the object members, obtain the longitude, latitude, and altitude corrections of each object member's position, and compute the corrected longitude, latitude, and altitude;
Step 13: using the state mutual observation matrix obtained in step 7 and the mutual observation noise covariance obtained in step 8, compute the object member position estimation covariance;
Step 14: using the object position projection matrix obtained in step 6 and the longitude, latitude, and altitude corrections of the object member's position obtained in step 12, compute the online modeling error; when the online modeling error is smaller than the preset error-control threshold for dynamic mutual observation online modeling, the modeling iteration is judged to have converged, online modeling ends, and the method proceeds to step 15; otherwise, return to step 5 to iteratively refine the mutual observation model;
Step 15: judge whether navigation has ended; if so, end; otherwise, return to step 1 to model at the next moment.
As a preferred scheme of the present invention, the mutual observation vector in step 5 is expressed as: (formula image not reproduced) where the mutual observation vector of object member i and its available reference member k has components along the east, north, and up axes of object member i's local ENU geographic frame; Δλ ik, ΔL ik, Δh ik are the differences between the longitude, latitude, and altitude output by the airborne navigation systems of object member i and its available reference member k; R N is the prime-vertical radius of curvature of the Earth reference ellipsoid; f is the flattening of the Earth reference ellipsoid; and L i, h i are the latitude and altitude output by object member i's airborne navigation system.
As a preferred scheme of the present invention, the vector projection matrix in step 5 is expressed as: (formula image not reproduced) where the vector projection matrix of object member i and its available reference member k is formed from the east, north, and up components of their mutual observation vector in object member i's local ENU geographic frame, and d ik is the computed distance between object member i and its available reference member k: (formula image not reproduced).
As a preferred scheme of the present invention, the object position projection matrix in step 6 is expressed as: (formula image not reproduced) where, in the object position projection matrix of object member i and its available reference member k, Δλ ik and ΔL ik are the differences between the longitude and latitude output by the airborne navigation systems of members i and k, L i and h i are the latitude and altitude output by object member i's airborne navigation system, and R N is the prime-vertical radius of curvature of the Earth reference ellipsoid.
As a preferred scheme of the present invention, the available-reference position projection matrix in step 6 is expressed as: (formula image not reproduced) where, in the available-reference position projection matrix of object member i and its available reference member k, L i and h i are the latitude and altitude output by object member i's airborne navigation system, and R N is the prime-vertical radius of curvature of the Earth reference ellipsoid.
As a preferred scheme of the present invention, the state mutual observation matrix in step 7 is expressed as: (formula image not reproduced) where the state mutual observation matrix of object member i and its available reference member k is formed from their vector projection matrix and their object position projection matrix.
As a preferred scheme of the present invention, the noise mutual observation matrix in step 8 is expressed as: (formula image not reproduced) where the noise mutual observation matrix of object member i and its available reference member k is formed from their vector projection matrix and their available-reference position projection matrix.
As a preferred scheme of the present invention, the mutual observation noise covariance in step 8 is expressed as: (formula image not reproduced) where the mutual observation noise covariance of object member i and its available reference member k is formed from their noise mutual observation matrix, the error covariance of the relative ranging sensor, and the positioning error covariance of available reference member k.
As a preferred scheme of the present invention, the online modeling error in step 14 is expressed as: (formula image not reproduced) where the online modeling error of object member i and its available reference member k is formed from their object position projection matrix and the longitude, latitude, and altitude corrections of object member i's position.
Compared with the prior art, the above technical solution of the present invention has the following technical effects:
1. The invention accounts for the dynamic change of each member's navigation performance during swarm flight; dynamic screening determines each member's role in collaborative navigation so that members with high positioning performance are preferentially selected to assist members with low positioning performance, avoiding the poor adaptability of fixed-role modeling.
2. The invention accounts for differences in positioning performance among reference members; by combining reference-member positioning error with ranging-sensor measurement error and introducing weighted iteration, modeling accuracy is improved.
3. The invention is highly flexible, adapting to drone swarms of different sizes and to mutual observation conditions under different relative positions and distributions of members.
Brief description of the drawings
Fig. 1 is a flowchart of the online modeling method for dynamic mutual observation of drone swarm collaborative navigation of the present invention.
Fig. 2 shows iterative-modeling curves in the object member's moving coordinate frame constructed by the method of the present invention.
Fig. 3 shows the position error curve of iterative modeling using the method of the present invention.
Fig. 4 shows the longitude, latitude, and altitude error curves of iterative modeling using the method of the present invention.
Detailed description of the embodiments
Embodiments of the present invention are described in detail below; examples of the embodiments are shown in the accompanying drawings. The embodiments described with reference to the drawings are exemplary, are intended only to explain the present invention, and are not to be construed as limiting it.
The present invention provides an online modeling method for dynamic mutual observation of drone swarm collaborative navigation, which provides effective support for drone swarm collaborative navigation and improves the flexibility and accuracy of collaborative navigation modeling. As shown in Fig. 1, the scheme comprises the following steps:
(1) Let the number of members in the drone swarm be n; number the members and denote them 1, 2, …, n; set the error-control threshold ζ for dynamic mutual observation online modeling.
(2) According to the number of usable satellites received by each swarm member's onboard satellite navigation receiver at the current moment, perform a first-level screening of the members to determine each member's role in collaborative navigation: members receiving fewer than 4 usable satellites are object members, and the set of object member numbers is denoted A; members receiving no fewer than 4 usable satellites are candidate reference members, and the set of candidate reference member numbers is denoted B; and (formula image not reproduced).
(3) Obtain the indicated position of the airborne navigation system of each object member classified in step (2) and, taking this indicated position as the origin, establish the member's local ENU geographic frame; denote the indicated position of object member i's airborne navigation system as (λ i, L i, h i) and the correspondingly established local ENU frame as O iXYZ, where λ is longitude, L is latitude, h is altitude, i is the member number, and i ∈ A.
(4) Obtain the indicated position and positioning error covariance of the airborne navigation system of each candidate reference member classified in step (2), and transform them into the object member's local ENU geographic frame established in step (3); denote the indicated position of candidate reference member j's airborne navigation system as (λ j, L j, h j), where j is the member number and j ∈ B.
(5) For each object member and each candidate reference member in turn, according to whether they can range each other, perform a second-level screening of the candidate reference members to determine their roles in collaborative navigation: candidate reference members that can mutually range with object member i are the available reference members of object member i, and the set of their numbers is denoted C i; and (formula image not reproduced).
(6) Compute the mutual observation vectors of each object member and its available reference members; the mutual observation vector of object member i and its available reference member k is expressed as: (formula image not reproduced) where i and k are member numbers with i ∈ A and k ∈ C i; Δλ ik is the difference between the longitudes output by the airborne navigation systems of object member i and its available reference member k, ΔL ik the difference between their output latitudes, and Δh ik the difference between their output altitudes; R N is the prime-vertical radius of curvature of the Earth reference ellipsoid, a constant; f is the flattening of the Earth reference ellipsoid, a constant; L i is the latitude output by object member i's airborne navigation system, and h i is the altitude output by object member i's airborne navigation system.
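The mutual observation vector maps geodetic differences between the two members into the object member's local ENU frame. The sketch below uses the standard small-difference approximation with the prime-vertical and meridian radii of curvature; the WGS-84 constants are assumptions (the patent names R_N and f only symbolically), and the patent's exact image-borne formula may differ:

```python
import math

# Assumed WGS-84 ellipsoid constants (the patent gives R_N and f symbolically).
A_E = 6378137.0              # semi-major axis, m
F_E = 1.0 / 298.257223563    # flattening f

def mutual_obs_vector(lam_i, L_i, h_i, lam_k, L_k, h_k):
    """East/North/Up components of the vector from object member i to
    available reference member k, from the geodetic differences
    (degrees for angles, metres for altitude)."""
    e2 = F_E * (2.0 - F_E)                               # eccentricity squared
    s = math.sin(math.radians(L_i))
    R_N = A_E / math.sqrt(1.0 - e2 * s * s)              # prime-vertical radius
    R_M = A_E * (1.0 - e2) / (1.0 - e2 * s * s) ** 1.5   # meridian radius
    d_lam = math.radians(lam_k - lam_i)
    d_L = math.radians(L_k - L_i)
    east = (R_N + h_i) * math.cos(math.radians(L_i)) * d_lam
    north = (R_M + h_i) * d_L
    up = h_k - h_i
    return east, north, up
```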
(7) Using the mutual observation vectors of the object members and their available reference members obtained in step (6), compute the vector projection matrices; the vector projection matrix of object member i and its available reference member k is expressed as: (formula image not reproduced) where d ik is the computed distance between object member i and its available reference member k, expressed as: (formula image not reproduced).
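The vector projection matrix of step (7) is built from the mutual observation vector and the computed distance d_ik. Since the exact matrix appears only as an image, the sketch below assumes the common reading of such a projection: the unit line-of-sight row vector obtained by dividing the ENU components by the range:

```python
import math

def vector_projection(r_enu):
    """Computed distance d_ik and the corresponding unit line-of-sight
    row vector (an assumed reading of the image-only projection matrix)."""
    d = math.sqrt(sum(c * c for c in r_enu))  # Euclidean range d_ik
    return d, tuple(c / d for c in r_enu)     # unit vector along the line of sight
```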
(8) Compute the object position projection matrices; the object position projection matrix of object member i and its available reference member k is expressed as: (formula image not reproduced).
(9) Compute the available-reference position projection matrices; the available-reference position projection matrix of object member i and its available reference member k is expressed as: (formula image not reproduced).
(10) Using the vector projection matrix obtained in step (7) and the object position projection matrix obtained in step (8), compute the state mutual observation matrix between each object member and its available reference members; the state mutual observation matrix of object member i and its available reference member k is expressed as: (formula image not reproduced).
(11) Using the vector projection matrix obtained in step (7) and the available-reference position projection matrix obtained in step (9), compute the noise mutual observation matrix between each object member and its available reference members; the noise mutual observation matrix of object member i and its available reference member k is expressed as: (formula image not reproduced).
(12) Using the noise mutual observation matrix obtained in step (11), compute the mutual observation noise covariance between each object member and its available reference members, expressed as: (formula image not reproduced) where one term denotes the error covariance of the relative ranging sensor and the other denotes the positioning error covariance of available reference member k.
(13) Using the state mutual observation matrices of object member i and its available reference members k obtained in step (10), build the mutual observation set matrix of each swarm member; the mutual observation set matrix of object member i over all its available reference members is expressed as: (formula image not reproduced), the matrix whose rows are the state mutual observation matrices for all k ∈ C i.
(14) Using the mutual observation noise covariances between the object members and their available reference members obtained in step (12), build the mutual observation set covariance of each swarm member; the mutual observation set covariance of object member i over all its available reference members is expressed as: (formula image not reproduced), the matrix whose diagonal elements are the covariances for all k ∈ C i and whose off-diagonal elements are 0.
(15) Using the mutual observation vectors of the object members and their available reference members obtained in step (6), build the mutual observation set measurement of each swarm member; the mutual observation set measurement of object member i over all its available reference members is expressed as: (formula image not reproduced) where d ik is the computed distance between object member i and its available reference member k, expressed as: (formula image not reproduced), together with the measured distance between object member i and its available reference member k.
(16) From the mutual observation set matrix of object member i over all its available reference members obtained in step (13), the mutual observation set covariance obtained in step (14), and the mutual observation set measurement obtained in step (15), form the dynamic mutual observation model of drone swarm collaborative navigation and perform weighted least-squares positioning of the object members, obtaining the longitude correction, latitude correction, and altitude correction of object member i's position.
(17) Using the longitude correction, latitude correction, and altitude correction of object member i obtained in step (16), compute the corrected longitude, latitude, and altitude, expressed as: (formula image not reproduced).
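Steps (16) and (17) amount to a standard weighted least-squares solve of the stacked mutual observation model. A generic NumPy sketch, where H is the mutual observation set matrix, R the set covariance, and z the set measurement residual (the exact stacking in the patent is shown only as images):

```python
import numpy as np

def wls_position_correction(H, R, z):
    """Weighted least-squares correction (delta-longitude, delta-latitude,
    delta-altitude) minimizing (z - H x)^T R^{-1} (z - H x)."""
    W = np.linalg.inv(R)                              # weight = inverse covariance
    return np.linalg.solve(H.T @ W @ H, H.T @ W @ z)  # normal equations
```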
(18) Using the state mutual observation matrix between the object members and their available reference members obtained in step (10) and the mutual observation noise covariance obtained in step (12), compute the object member position estimation covariance; the position estimation covariance of object member i is denoted σ pi and expressed as: (formula image not reproduced).
(19) Using the object position projection matrix obtained in step (8) and the longitude, latitude, and altitude corrections of object member i obtained in step (16), compute the online modeling error, expressed as: (formula image not reproduced).
(20) Judge whether the online modeling iteration has converged: if the online modeling error is below the threshold ζ (criterion shown as an image in the original), convergence is declared, online modeling ends, and the method proceeds to step (21); otherwise, return to step (6) to iteratively refine the mutual observation model.
(21) Judge whether navigation has ended; if so, end; otherwise, return to step (2) to model at the next moment.
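The convergence logic of steps (18) to (21) can be sketched as a simple fixed-point loop. Here `refine` is a hypothetical stand-in for one pass of steps (6) to (19), returning the updated position and the current modeling error:

```python
def iterate_online_model(refine, x0, zeta, max_iter=50):
    """Repeat the mutual-observation refinement until the online modeling
    error falls below the preset control threshold zeta, as in step (20).
    refine: callable position -> (new_position, modeling_error)."""
    x = x0
    for _ in range(max_iter):
        x, err = refine(x)
        if err < zeta:      # convergence criterion of step (20)
            break
    return x
```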
To verify the effectiveness of the proposed drone swarm collaborative navigation method under dynamic observation conditions, a digital simulation was carried out. The simulated swarm contained 8 drones, with a relative ranging accuracy of 0.1 m. Fig. 1 is the scheme diagram of the dynamic mutual observation modeling method for drone swarm collaborative navigation of the present invention; Fig. 2 shows iterative-modeling curves in the object member's moving coordinate frame constructed by the method; Fig. 3 shows the position error curve of iterative modeling with the method; Fig. 4 shows the longitude, latitude, and altitude error curves of iterative modeling with the method.
Fig. 2 shows that, with the proposed mutual observation model and online modeling method, the computed position of an object member in the swarm converges from its initial position toward the true position. Fig. 3 shows that the object member's position error decreases steadily, with the final computed position error 4 orders of magnitude below the initial error. Fig. 4 shows that the errors in the longitude, latitude, and altitude directions all decrease steadily. Moreover, the method adapts to the continuous change of mutual observation relations and member roles during swarm flight and therefore has good application value.
The above embodiments merely illustrate the technical idea of the present invention and do not limit its scope of protection; any modification made on the basis of the technical solution according to the technical idea proposed by the present invention falls within the scope of protection of the present invention.

Claims (9)

  1. An online modeling method for dynamic mutual observation of drone swarm collaborative navigation, characterized by comprising the following steps:
    Step 1: number each member of the drone swarm and denote the members 1, 2, …, n; according to the number of usable satellites received by each member's onboard satellite navigation receiver at the current moment, perform a first-level screening of the members to determine each member's role in collaborative navigation: members receiving fewer than 4 usable satellites are object members, and the set of object member numbers is denoted A; members receiving no fewer than 4 usable satellites are candidate reference members, and the set of candidate reference member numbers is denoted B; and
    (formula image not reproduced)
    Step 2: obtain the position indicated by the airborne navigation system of object member i and, taking this indicated position as the origin, establish the member's local ENU geographic coordinate frame, where i is the member number and i ∈ A;
    Step 3: obtain the position indicated by the airborne navigation system of candidate reference member j and its positioning error covariance, and transform both into the local ENU geographic frame of object member i established in step 2, where j is the member number and j ∈ B;
    Step 4: according to whether each object member and each candidate reference member can range each other, perform a second-level screening of the candidate reference members to determine each candidate reference member's role in collaborative navigation: candidate reference members that can mutually range with object member i are the available reference members of object member i, and the set of their numbers is denoted C i; and
    (formula image not reproduced)
    Step 5: compute the mutual observation vectors of each object member and its available reference members, and from the mutual observation vectors compute their vector projection matrices;
    Step 6: compute the object position projection matrices and the available-reference position projection matrices of each object member and its available reference members;
    Step 7: using the vector projection matrix obtained in step 5 and the object position projection matrix obtained in step 6, compute the state mutual observation matrix between each object member and its available reference members;
    Step 8: using the vector projection matrix obtained in step 5 and the available-reference position projection matrix obtained in step 6, compute the noise mutual observation matrix between each object member and its available reference members; using the noise mutual observation matrix, compute the mutual observation noise covariance between each object member and its available reference members;
    Step 9: using the state mutual observation matrices obtained in step 7, build the mutual observation set matrix of each object member over all its available reference members;
    Step 10: using the mutual observation noise covariances obtained in step 8, build the mutual observation set covariance of each object member over all its available reference members;
    Step 11: using the mutual observation vectors obtained in step 5, build the mutual observation set measurement of each object member over all its available reference members;
    Step 12: from the mutual observation set matrix obtained in step 9, the mutual observation set covariance obtained in step 10, and the mutual observation set measurement obtained in step 11, establish the dynamic mutual observation model for drone swarm collaborative navigation; based on the model, perform weighted least-squares positioning of the object members, obtain the longitude, latitude, and altitude corrections of each object member's position, and compute the corrected longitude, latitude, and altitude;
    Step 13: using the state mutual observation matrix obtained in step 7 and the mutual observation noise covariance obtained in step 8, compute the object member position estimation covariance;
    Step 14: using the object position projection matrix obtained in step 6 and the longitude, latitude, and altitude corrections of the object member's position obtained in step 12, compute the online modeling error; when the online modeling error is smaller than the preset error-control threshold for dynamic mutual observation online modeling, the modeling iteration is judged to have converged, online modeling ends, and the method proceeds to step 15; otherwise, return to step 5 to iteratively refine the mutual observation model;
    Step 15: judge whether navigation has ended; if so, end; otherwise, return to step 1 to model at the next moment.
  2. The online modeling method for dynamic mutual observation of drone swarm collaborative navigation according to claim 1, characterized in that the mutual observation vector in step 5 is expressed as:
    (formula image not reproduced)
    where the mutual observation vector of object member i and its available reference member k has components along the east, north, and up axes of object member i's local ENU geographic frame; Δλ ik, ΔL ik, Δh ik are the differences between the longitude, latitude, and altitude output by the airborne navigation systems of object member i and its available reference member k; R N is the prime-vertical radius of curvature of the Earth reference ellipsoid; f is the flattening of the Earth reference ellipsoid; and L i, h i are the latitude and altitude output by object member i's airborne navigation system.
  3. The online modeling method for dynamic mutual observation of drone swarm collaborative navigation according to claim 1, characterized in that the vector projection matrix in step 5 is expressed as:
    (formula image not reproduced)
    where the vector projection matrix of object member i and its available reference member k is formed from the east, north, and up components of their mutual observation vector in object member i's local ENU geographic frame, and d ik is the computed distance between object member i and its available reference member k:
    (formula image not reproduced)
  4. The online modeling method for dynamic mutual observation of drone swarm collaborative navigation according to claim 1, characterized in that the object position projection matrix in step 6 is expressed as:
    (formula image not reproduced)
    where, in the object position projection matrix of object member i and its available reference member k, Δλ ik and ΔL ik are the differences between the longitude and latitude output by the airborne navigation systems of members i and k, L i and h i are the latitude and altitude output by object member i's airborne navigation system, and R N is the prime-vertical radius of curvature of the Earth reference ellipsoid.
  5. The online modeling method for dynamic mutual observation of drone swarm collaborative navigation according to claim 1, characterized in that the available-reference position projection matrix in step 6 is expressed as:
    (formula image not reproduced)
    where, in the available-reference position projection matrix of object member i and its available reference member k, L i and h i are the latitude and altitude output by object member i's airborne navigation system, and R N is the prime-vertical radius of curvature of the Earth reference ellipsoid.
  6. The online modeling method for dynamic mutual observation of drone swarm collaborative navigation according to claim 1, characterized in that the state mutual observation matrix in step 7 is expressed as:
    (formula image not reproduced)
    where the state mutual observation matrix of object member i and its available reference member k is formed from their vector projection matrix and their object position projection matrix.
  7. The online modeling method for dynamic mutual observation of drone swarm collaborative navigation according to claim 1, characterized in that the noise mutual observation matrix in step 8 is expressed as:
    (formula image not reproduced)
    where the noise mutual observation matrix of object member i and its available reference member k is formed from their vector projection matrix and their available-reference position projection matrix.
  8. The online modeling method for dynamic mutual observation of drone swarm collaborative navigation according to claim 1, characterized in that the mutual observation noise covariance in step 8 is expressed as:
    (formula image not reproduced)
    where the mutual observation noise covariance of object member i and its available reference member k is formed from their noise mutual observation matrix, the error covariance of the relative ranging sensor, and the positioning error covariance of available reference member k.
  9. The online modeling method for dynamic mutual observation of drone swarm collaborative navigation according to claim 1, characterized in that the online modeling error in step 14 is expressed as:
    (formula image not reproduced)
    where the online modeling error of object member i and its available reference member k is formed from their object position projection matrix and the longitude, latitude, and altitude corrections of object member i's position.
PCT/CN2020/105037 2019-07-31 2020-07-28 Online modeling method for dynamic mutual observation of drone swarm collaborative navigation WO2021018113A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/274,445 US20210255645A1 (en) 2019-07-31 2020-07-28 Online modeling method for dynamic mutual observation of drone swarm collaborative navigation

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910699294.4A CN110426029B (zh) 2019-07-31 2019-07-31 用于无人机蜂群协同导航的动态互观测在线建模方法
CN201910699294.4 2019-07-31

Publications (1)

Publication Number Publication Date
WO2021018113A1 true WO2021018113A1 (zh) 2021-02-04

Family

ID=68413238

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/105037 WO2021018113A1 (zh) 2019-07-31 2020-07-28 用于无人机蜂群协同导航的动态互观测在线建模方法

Country Status (3)

Country Link
US (1) US20210255645A1 (zh)
CN (1) CN110426029B (zh)
WO (1) WO2021018113A1 (zh)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113670307A (zh) * 2021-07-13 2021-11-19 南京航空航天大学 基于角度混合定位精度因子的无人集群协同导航方法
CN113689501A (zh) * 2021-08-26 2021-11-23 电子科技大学 一种基于收敛点的双机协同目标机定位跟踪控制方法
CN113807591A (zh) * 2021-09-22 2021-12-17 电子科技大学 一种通信距离受限的无人机集群站点协同优化部署方法
CN113804148A (zh) * 2021-08-04 2021-12-17 吉林建筑科技学院 一种基于动态基准的测量平差方法
CN114353800A (zh) * 2021-12-31 2022-04-15 哈尔滨工业大学 一种基于谱图方法的多机器人互定位可观性判别方法及系统
CN114740901A (zh) * 2022-06-13 2022-07-12 深圳联和智慧科技有限公司 一种无人机集群飞行方法、系统及云平台
RU2805431C1 (ru) * 2022-12-30 2023-10-16 Мухамедзянов Равиль Рашидович Самоорганизующийся и самоуправляемый рой БПЛА и способ контроля территории на наличие установленного события посредством такого роя

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110426029B (zh) * 2019-07-31 2022-03-25 南京航空航天大学 Online modeling method for dynamic mutual observation of drone swarm collaborative navigation
CN111080258B (zh) * 2019-12-18 2020-11-17 中国人民解放军军事科学院国防科技创新研究院 Collaborative task management subsystem for group unmanned systems based on role state machines
CN111208544B (zh) * 2020-03-04 2022-06-17 南京航空航天大学 Integrity protection level optimization method for UAV swarm collaborative navigation
CN111473784B (zh) * 2020-04-16 2023-06-20 南京航空航天大学 UAV swarm collaborative navigation system and method based on distributed node information blocks
CN113960639B (zh) * 2021-10-20 2024-05-14 中国电子科技集团公司第二十研究所 Navigation source deployment position method based on iterative partitioning of the deployment region
CN114326823B (zh) * 2022-03-16 2023-04-07 北京远度互联科技有限公司 Numbering method and apparatus for UAV swarms, electronic device, and storage medium
CN115826622B (zh) * 2023-02-13 2023-04-28 西北工业大学 Night-time cooperative positioning method for UAV swarms
CN115793717B (zh) * 2023-02-13 2023-05-05 中国科学院自动化研究所 Group collaborative decision-making method and apparatus, electronic device, and storage medium
CN116400715B (zh) * 2023-03-02 2024-06-21 中国人民解放军战略支援部队信息工程大学 Multi-UAV cooperative direct tracking method based on a CNN+BiLSTM neural network under model error conditions
CN116358564B (zh) * 2023-06-01 2023-07-28 中国人民解放军战略支援部队航天工程大学 Method, system, device, and medium for tracking the centroid motion state of a UAV swarm
CN118102225B (zh) * 2024-04-23 2024-07-23 四川腾盾科技有限公司 UAV swarm navigation and topology control method based on distributed relative positioning
CN118244798B (zh) * 2024-05-30 2024-08-30 四川腾盾科技有限公司 Ranging-based adaptive control method for distributed formation shapes of UAV swarms

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103335646A (zh) * 2013-06-20 2013-10-02 哈尔滨工程大学 Multi-vessel collaborative navigation method based on decentralized augmented information filtering
CN106482736A (zh) * 2016-07-11 2017-03-08 安徽工程大学 Multi-robot cooperative localization algorithm based on square-root cubature Kalman filtering
CN108151737A (zh) * 2017-12-19 2018-06-12 南京航空航天大学 UAV swarm collaborative navigation method under dynamic mutual observation relationships
US20180308226A1 (en) * 2017-04-24 2018-10-25 Korea Aerospace Research Institute Apparatus and Method For Image Navigation and Registration of Geostationary Remote Sensing Satellites
CN109708629A (zh) * 2018-11-15 2019-05-03 南京航空航天大学 Aircraft swarm collaborative navigation method for conditions of differing positioning performance
CN110426029A (zh) * 2019-07-31 2019-11-08 南京航空航天大学 Online modeling method for dynamic mutual observation of drone swarm collaborative navigation

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7218240B2 (en) * 2004-08-10 2007-05-15 The Boeing Company Synthetically generated sound cues


Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113670307A (zh) * 2021-07-13 2021-11-19 南京航空航天大学 Unmanned swarm collaborative navigation method based on an angle-hybrid dilution-of-precision factor
CN113670307B (zh) * 2021-07-13 2024-02-13 南京航空航天大学 Unmanned swarm collaborative navigation method based on an angle-hybrid dilution-of-precision factor
CN113804148A (zh) * 2021-08-04 2021-12-17 吉林建筑科技学院 Measurement adjustment method based on a dynamic datum
CN113804148B (zh) * 2021-08-04 2024-04-19 吉林建筑科技学院 Measurement adjustment method based on a dynamic datum
CN113689501A (zh) * 2021-08-26 2021-11-23 电子科技大学 Convergence-point-based cooperative positioning and tracking control method for a target aircraft using two aircraft
CN113807591A (zh) * 2021-09-22 2021-12-17 电子科技大学 Cooperative optimized deployment method for UAV swarm sites with limited communication range
CN113807591B (zh) * 2021-09-22 2023-04-07 电子科技大学 Cooperative optimized deployment method for UAV swarm sites with limited communication range
CN114353800A (zh) * 2021-12-31 2022-04-15 哈尔滨工业大学 Method and system for determining mutual-localization observability of multiple robots based on spectral graph methods
CN114353800B (zh) * 2021-12-31 2023-10-24 哈尔滨工业大学 Method and system for determining mutual-localization observability of multiple robots based on spectral graph methods
CN114740901A (zh) * 2022-06-13 2022-07-12 深圳联和智慧科技有限公司 UAV swarm flight method, system, and cloud platform
CN114740901B (zh) * 2022-06-13 2022-08-19 深圳联和智慧科技有限公司 UAV swarm flight method, system, and cloud platform
RU2805431C1 (ru) * 2022-12-30 2023-10-16 Мухамедзянов Равиль Рашидович Self-organizing and self-controlled UAV swarm and method of monitoring a territory for the occurrence of a specified event by means of such a swarm

Also Published As

Publication number Publication date
CN110426029B (zh) 2022-03-25
CN110426029A (zh) 2019-11-08
US20210255645A1 (en) 2021-08-19

Similar Documents

Publication Publication Date Title
WO2021018113A1 (zh) Online modeling method for dynamic mutual observation of drone swarm collaborative navigation
CN106871927B (zh) Calibration method for installation errors of a UAV electro-optical pod
CN108387227B (zh) Multi-node information fusion method and system for airborne distributed POS
CN107192375B (zh) Adaptive positioning correction method for multi-frame UAV imagery based on aerial photography attitude
CN105222788A (zh) Self-correction method for aircraft flight-path offset error based on feature matching
CN106940181B (zh) Method for constructing a UAV image control point distribution network and matching selectable ranges of aerial photographs
CN112698664B (zh) Dynamic line-of-sight sector estimation method for UAV swarm collaborative navigation optimization
CN110046563B (zh) Elevation correction method for power transmission line cross-sections based on UAV point clouds
CN109858137B (zh) Track estimation method for complex maneuvering aircraft based on a learnable extended Kalman filter
CN111156986B (zh) Autonomous integrated navigation method using spectral redshift based on a robust adaptive UKF
CN105180963A (zh) UAV telemetry parameter correction method based on online calibration
CN102508260A (zh) Geometric imaging construction method for side-looking medium-resolution satellites
CN109708629A (zh) Aircraft swarm collaborative navigation method for conditions of differing positioning performance
CN112146650A (zh) Configuration optimization method for unmanned swarm collaborative navigation
WO2022193106A1 (zh) Method for fused GPS and lidar positioning using inertial measurement parameters
CN115683141A (zh) Local reference path generation method for autonomous driving in unknown environments
CN109856616B (zh) Method for correcting relative systematic errors in radar positioning
CN103344946A (zh) Real-time error registration method for ground-based radar and airborne mobile platform radar
CN117455960B (zh) Passive ground-target positioning and filtering method for airborne electro-optical systems under time-varying observation noise
CN113514829A (zh) Block adjustment method for InSAR-oriented initial DSMs
CN113252038A (zh) Terrain-aided navigation method for track planning based on particle swarm optimization
CN113310489A (zh) Unmanned swarm collaborative navigation optimization method based on a mutual-observation virtual reference domain
CN110411449B (zh) Target positioning method and system for aerial reconnaissance payloads, and terminal device
CN117156382A (zh) Collaborative navigation optimization method for aircraft swarms distributed in different airspaces
CN108489483B (zh) Single-star suboptimal correction algorithm for a shipborne starlight orientation device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20846716

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20846716

Country of ref document: EP

Kind code of ref document: A1
