CN114241349A - Multi-unmanned-boat collaborative identification method and device - Google Patents

Multi-unmanned-boat collaborative identification method and device

Info

Publication number
CN114241349A
Authority
CN
China
Prior art keywords: meta, action, target, element action, unmanned
Prior art date
Legal status
Pending
Application number
CN202111297322.3A
Other languages
Chinese (zh)
Inventor
韩玮
郭晓晔
谢杨柳
曾江峰
陈骁
张馗
王千一
马向峰
陈卓
王伟
梁旭
董钉
李哲
胥凤驰
骆福宇
王一帆
刘如磊
王子帅
吴与伦
宋胜男
董洁琳
许埔宁
王伟蒙
Current Assignee
CSSC Systems Engineering Research Institute
Original Assignee
CSSC Systems Engineering Research Institute
Priority date
Filing date
Publication date
Application filed by CSSC Systems Engineering Research Institute
Priority to CN202111297322.3A
Publication of CN114241349A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/25 Fusion techniques
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/89 Radar or analogous systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/93 Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/933 Radar or analogous systems specially adapted for specific applications for anti-collision purposes of aircraft or spacecraft

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Theoretical Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention provides a multi-unmanned-boat collaborative identification method and device. The method comprises: performing meta-action decomposition on the detection task of a single unmanned boat to obtain a meta-action set containing the meta-action decomposition result; and determining the collaborative identification task among multiple unmanned boats based on the meta-action decomposition result in the meta-action set. The identification method is based on the idea of meta-action decomposition: the unmanned-boat identification process is decomposed into a combination of several meta-actions, forming a meta-action set grounded in the unmanned boat's capabilities. Each meta-action is highly cohesive and loosely coupled, can be independently optimized and continuously improved, and serves as a building block for designing the multi-unmanned-boat collaborative identification process. The collaborative identification process can therefore be flexibly modified and adapted to more application scenarios by adding meta-actions, which effectively reduces the difficulty of subsequent optimization and debugging and gives the method good extensibility.

Description

Multi-unmanned-boat collaborative identification method and device
Technical Field
The invention relates to the technical field of computers, in particular to a method and a device for cooperatively identifying multiple unmanned boats.
Background
Unmanned boats are widely used in both military and civil fields; they can replace manned vessels in tasks such as reconnaissance, meteorological detection and hydrological measurement, and new use scenarios will continue to emerge. Detecting, identifying and sensing the surrounding environment and targets is the basis on which an unmanned boat autonomously executes these various tasks.
As the software and hardware an unmanned boat can carry (detection equipment, algorithm strategies, and so on) keep changing, and as the working mode gradually develops from a single boat to multi-boat cooperation, designing a collaborative identification method that is highly extensible and can be continuously optimized and improved is of great importance to the technical development and practical application of unmanned boats.
Disclosure of Invention
The present invention is directed to solving at least one of the problems of the prior art. Therefore, the invention provides a method and a device for cooperatively identifying multiple unmanned boats.
Specifically, the invention provides the following technical scheme:
in a first aspect, an embodiment of the present invention provides a multi-unmanned-boat collaborative identification method, including:
performing element action decomposition on the detection task of a single unmanned ship to obtain an element action set containing element action decomposition results; the meta-action set comprises a plurality of meta-actions, and the meta-actions are meta-actions which need to be executed at different stages of the detection task respectively;
and determining a collaborative identification task among the unmanned boats based on the meta-action decomposition result in the meta-action set.
Further, the method for decomposing the meta-motion of the detection task of the single unmanned ship to obtain a meta-motion set including a meta-motion decomposition result includes:
performing element action decomposition on the detection task of a single unmanned ship according to each stage of the detection process to obtain an element action decomposition result as follows: the method comprises the following steps of navigation radar data processing element action, photoelectric data processing element action, image identification element action, multi-source data fusion element action and target sequencing element action;
and taking the navigation radar data processing meta-motion, the photoelectric data processing meta-motion, the image recognition meta-motion, the multi-source data fusion meta-motion and the target sorting meta-motion in the meta-motion decomposition result as a meta-motion set.
Further, determining a collaborative recognition task among a plurality of unmanned boats based on the meta-action decomposition results in the meta-action set, including:
based on meta-action decomposition results in the meta-action set: the method comprises the following steps of navigation radar data processing element action, photoelectric data processing element action, image identification element action, multi-source data fusion element action and target sequencing element action, and determining element action included in a collaborative identification task among a plurality of unmanned boats as: the method comprises the following steps of navigation radar data processing element action, photoelectric data processing element action, image identification element action, multi-source data fusion element action, target sequencing element action and target distribution element action;
the target allocation element action is to allocate the targets to be identified among the unmanned boats according to a preset task strategy.
Further, the navigation radar data processing meta-action comprises: acquiring initial information of targets around the unmanned boat's route, wherein the initial information comprises speed and direction information;
the photoelectric data processing meta-action comprises: acquiring stable image or video information of a target;
the image recognition meta-action comprises: identifying the target according to the image or video information to obtain the type and color information of the target;
the multi-source data fusion meta-action comprises: correlating and updating information from multiple sources such as the navigation radar, the photoelectric equipment and image recognition;
the target sorting meta-action comprises: sorting the targets by importance according to the information of each target to form a target confirmation list.
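As a purely illustrative aid (not part of the patent), these five meta-actions can be modeled as independent, loosely coupled processing units behind a common interface. The following Python sketch is a minimal assumption-laden illustration; the Target record, class names and fields are hypothetical.
```python
from abc import ABC, abstractmethod
from dataclasses import dataclass, field

@dataclass
class Target:
    """Hypothetical target record accumulated across meta-actions."""
    track_id: int
    speed: float = 0.0            # from navigation radar data processing
    heading: float = 0.0          # from navigation radar data processing
    frames: list = field(default_factory=list)  # from photoelectric data processing
    category: str = "unknown"     # from image recognition
    color: str = "unknown"        # from image recognition
    importance: float = 0.0       # from target sorting

class MetaAction(ABC):
    """Common interface: each meta-action is highly cohesive, loosely coupled,
    and can be optimized or replaced independently of the others."""
    @abstractmethod
    def execute(self, targets: list) -> list:
        ...

class NavRadarDataProcessing(MetaAction):
    def execute(self, targets):
        # Acquire initial speed/heading information of targets around the route.
        return targets

class PhotoelectricDataProcessing(MetaAction):
    def execute(self, targets):
        # Acquire stable image or video frames for each target.
        return targets

class ImageRecognition(MetaAction):
    def execute(self, targets):
        # Infer each target's type and color from the collected frames.
        return targets

class MultiSourceDataFusion(MetaAction):
    def execute(self, targets):
        # Associate and update radar, photoelectric and recognition information.
        return targets

class TargetSorting(MetaAction):
    def execute(self, targets):
        # Rank targets by importance to form the target confirmation list.
        return sorted(targets, key=lambda t: t.importance, reverse=True)
```
Because every unit exposes the same execute interface, a recognition flow is just an ordered composition of such units, which is what allows the flow to be modified or extended by swapping or adding meta-actions.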
Further, the detection task of a single unmanned ship is decomposed according to the element actions in each stage of the detection process, and the obtained element action decomposition result is as follows: navigation radar data processing element action, photoelectric data processing element action, image identification element action, multi-source data fusion element action and target sequencing element action, including:
controlling the single unmanned boat to execute the following processing procedures according to the element action decomposition result: controlling a single unmanned ship to sail to a task area according to a pre-planned scheme, detecting the periphery according to the pre-planned scheme, acquiring the initial information of a peripheral target based on a navigation radar, acquiring stable image or video information of the target by utilizing photoelectric equipment, identifying the target by utilizing image identification and analysis equipment according to the image or video information, obtaining an identification result containing the type and color information of the target, and forming a target list containing the initial information and the identification result;
after the target list sent by each unmanned ship is obtained, determining the target acquisition list of each unmanned ship according to the target value of the identification result in the target list of each unmanned ship, and sending the target acquisition list of each unmanned ship to the corresponding unmanned ship.
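A minimal sketch of this single-boat flow follows, under the assumption that the meta-actions expose a common execute interface as in the sketch above; the value_of scoring function is a placeholder and not something prescribed by the patent.
```python
def single_boat_target_confirmation(meta_actions, radar_tracks):
    """Run the decomposed meta-actions in order and return this boat's target list
    (initial information plus recognition results)."""
    targets = radar_tracks
    for action in meta_actions:   # radar -> photoelectric -> recognition -> fusion -> sorting
        targets = action.execute(targets)
    return targets

def build_acquisition_lists(target_lists_by_boat, value_of):
    """Determine each boat's target acquisition list from the target value of its
    recognition results, to be sent back to the corresponding boat."""
    return {
        boat_id: sorted(targets, key=value_of, reverse=True)
        for boat_id, targets in target_lists_by_boat.items()
    }
```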
Further, based on the meta-action decomposition results in the meta-action set: the method comprises the following steps of navigation radar data processing element action, photoelectric data processing element action, image identification element action, multi-source data fusion element action and target sequencing element action, and determining element action included in a collaborative identification task among a plurality of unmanned boats as: navigation radar data processing element action, photoelectric data processing element action, image identification element action, multi-source data fusion element action, target sequencing element action and target distribution element action, and the method comprises the following steps:
controlling the single unmanned boat to execute the following processing procedures according to the element action decomposition result: controlling a single unmanned ship to sail to a task area according to a pre-planned scheme, detecting the periphery according to the pre-planned scheme, acquiring the initial information of a peripheral target based on a navigation radar, acquiring stable image or video information of the target by utilizing photoelectric equipment, identifying the target by utilizing image identification and analysis equipment according to the image or video information, obtaining an identification result containing the type and color information of the target, and forming a target list containing the initial information and the identification result;
after a target list sent by each unmanned ship is obtained, a total target acquisition list which needs to be cooperatively acquired by a plurality of unmanned ships is determined according to the target value of the identification result in the target list of each unmanned ship, and the targets which need to be identified in the total target acquisition list are distributed among the unmanned ships according to a preset task strategy.
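The multi-boat variant differs mainly in that the per-boat lists are merged into one total acquisition list before being distributed. A rough sketch under the same assumptions is given below; boat_states is a hypothetical description of each boat's capability and kinematic state, and allocate stands for the preset task strategy (illustrated further below).
```python
def collaborative_acquisition(boat_states, target_lists_by_boat, value_of, allocate):
    """Merge the per-boat target lists into one total acquisition list ordered by the
    target value of the recognition results, then distribute it among the boats
    according to the preset task strategy."""
    total_list = [t for targets in target_lists_by_boat.values() for t in targets]
    total_list.sort(key=value_of, reverse=True)
    return allocate(total_list, boat_states)
```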
In a second aspect, an embodiment of the present invention further provides a multi-unmanned-boat collaborative identification apparatus, including:
the meta-action decomposition module is used for performing meta-action decomposition on the detection task of a single unmanned boat to obtain a meta-action set containing the meta-action decomposition result; the meta-action set comprises a plurality of meta-actions, each of which needs to be executed at a different stage of the detection task;
and the cooperative identification module is used for determining a cooperative identification task among the unmanned boats based on the meta-action decomposition result in the meta-action set.
Further, the meta-action decomposition module is specifically configured to:
performing element action decomposition on the detection task of a single unmanned ship according to each stage of the detection process to obtain an element action decomposition result as follows: the method comprises the following steps of navigation radar data processing element action, photoelectric data processing element action, image identification element action, multi-source data fusion element action and target sequencing element action;
and taking the navigation radar data processing meta-motion, the photoelectric data processing meta-motion, the image recognition meta-motion, the multi-source data fusion meta-motion and the target sorting meta-motion in the meta-motion decomposition result as a meta-motion set.
In a third aspect, an embodiment of the present invention further provides an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the multi-unmanned-boat collaborative identification method according to the first aspect when executing the program.
In a fourth aspect, the present invention further provides a non-transitory computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of the multi-unmanned-boat collaborative recognition method according to the first aspect.
According to the technical scheme, the multi-unmanned-boat collaborative identification method and device provided by the embodiment of the invention perform meta-action decomposition on the detection task of a single unmanned boat to obtain a meta-action set containing the meta-action decomposition result, and determine the collaborative identification task among multiple unmanned boats based on the meta-action decomposition result in the meta-action set. Following the idea of meta-action decomposition, the unmanned-boat identification process is decomposed into a combination of several meta-actions, forming a meta-action set based on the unmanned boat's capabilities. Each meta-action is highly cohesive and loosely coupled, can be independently optimized and continuously improved, and serves as the basis for designing the multi-unmanned-boat collaborative identification process. The collaborative identification process can be flexibly modified and adapted to more application scenarios by adding meta-actions, which effectively reduces the difficulty of subsequent optimization and debugging and gives the method good extensibility.
It is to be understood that additional aspects and advantages of the invention will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
Fig. 1 is a flowchart of a multi-unmanned-boat collaborative identification method according to an embodiment of the present invention;
Fig. 2 is a flowchart of single-boat target confirmation provided by an embodiment of the present invention;
Fig. 3 is a flowchart of multi-unmanned-boat cooperative identification provided in an embodiment of the present invention;
Fig. 4 is a timing diagram illustrating multi-unmanned-boat cooperative identification according to an embodiment of the present invention;
Fig. 5 is a schematic structural diagram of a multi-unmanned-boat cooperative identification apparatus according to an embodiment of the present invention;
Fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Unmanned boats are widely used in both military and civil fields; they can replace manned vessels in tasks such as reconnaissance, meteorological detection and hydrological measurement, and new use scenarios will continue to emerge. Detecting, identifying and sensing the surrounding environment and targets is the basis on which an unmanned boat autonomously executes these various tasks.
As the software and hardware an unmanned boat can carry (detection equipment, algorithm strategies, and so on) keep changing, and as the working mode gradually develops from a single boat to multi-boat cooperation, designing a collaborative identification method that is highly extensible and can be continuously optimized and improved is of great importance to the technical development and practical application of unmanned boats.
The unmanned ship identification method is based on the idea of element action decomposition, and decomposes an unmanned ship identification process into a combination of a plurality of element actions to form an element action set based on unmanned ship capability. Each element action has the characteristics of high cohesion and loose coupling, can be independently optimized and continuously promoted, and is used as a basis for designing a multi-unmanned-boat collaborative identification process. The method can flexibly modify the collaborative identification process, adapts to more application scenes by increasing the meta-action, can effectively reduce the difficulty of subsequent continuous optimization debugging, and has good expansibility. The multi-unmanned-boat collaborative identification method provided by the invention will be described in detail through specific embodiments.
Fig. 1 shows a flowchart of a multi-unmanned-vessel cooperative identification method according to an embodiment of the present invention, and referring to fig. 1, the multi-unmanned-vessel cooperative identification method according to the embodiment of the present invention includes:
step 101: performing element action decomposition on the detection task of a single unmanned ship to obtain an element action set containing element action decomposition results; the meta-action set comprises a plurality of meta-actions, and the meta-actions are meta-actions which need to be executed at different stages of the detection task respectively;
step 102: and determining a collaborative identification task among the unmanned boats based on the meta-action decomposition result in the meta-action set.
In this embodiment, a typical flow for a single boat to perform a detection task is shown in fig. 2. The process mainly comprises the following main steps: (1) mission planning; (2) navigation control; (3) target confirmation; (4) task termination. The unmanned-boat meta-actions into which the target confirmation process can be decomposed are described in Table 1 below.
TABLE 1
[Table 1 is provided as an image in the original document; it lists the unmanned-boat meta-actions decomposed from the single-boat target confirmation process: navigation radar data processing, photoelectric data processing, image recognition, multi-source data fusion, and target sorting.]
Based on the meta-action decomposition result, the main process of the multi-unmanned-boat collaborative recognition task and the interaction among the meta-actions are designed, as shown in fig. 3 and 4.
The meta-actions involved in the above design process are shown in table 2 below.
TABLE 2
[Table 2 is provided as images in the original document; it lists the meta-actions involved in the multi-unmanned-boat collaborative identification flow: navigation radar data processing, photoelectric data processing, image recognition, multi-source data fusion, target sorting, and target allocation.]
Among these, target allocation is the meta-action newly added for the multi-unmanned-boat collaborative identification scenario, while the rest are meta-actions obtained by decomposing the single-boat identification process; this fully demonstrates the flexibility and extensibility of the scheme.
According to the technical scheme, the multi-unmanned-boat collaborative identification method provided by the embodiment of the invention obtains the meta-action set containing the meta-action decomposition result by performing the meta-action decomposition on the detection task of a single unmanned boat, and determines the collaborative identification tasks among the plurality of unmanned boats based on the meta-action decomposition result in the meta-action set, so that the concept of decomposing the meta-action is realized, the unmanned boat identification process is decomposed into a combination of a plurality of meta-actions, and the meta-action set based on the unmanned boat capability is formed. Each element action has the characteristics of high cohesion and loose coupling, can be independently optimized and continuously promoted, and is used as a basis for designing a multi-unmanned-boat collaborative identification process. The method can flexibly modify the collaborative identification process, adapts to more application scenes by increasing the meta-action, can effectively reduce the difficulty of subsequent continuous optimization debugging, and has good expansibility.
Based on the content of the foregoing embodiment, in this embodiment, performing meta-motion decomposition on a probe task of a single unmanned ship to obtain a meta-motion set including a result of the meta-motion decomposition includes:
performing element action decomposition on the detection task of a single unmanned ship according to each stage of the detection process to obtain an element action decomposition result as follows: the method comprises the following steps of navigation radar data processing element action, photoelectric data processing element action, image identification element action, multi-source data fusion element action and target sequencing element action;
and taking the navigation radar data processing meta-motion, the photoelectric data processing meta-motion, the image recognition meta-motion, the multi-source data fusion meta-motion and the target sorting meta-motion in the meta-motion decomposition result as a meta-motion set.
In the embodiment, a meta-motion set containing a meta-motion decomposition result is obtained by performing meta-motion decomposition on a detection task of a single unmanned ship, and a collaborative identification task among a plurality of unmanned ships is determined based on the meta-motion decomposition result in the meta-motion set, so that the concept of decomposing the meta-motion is realized, an unmanned ship identification process is decomposed into a combination of a plurality of meta-motions, and a meta-motion set based on unmanned ship capability is formed.
Based on the content of the foregoing embodiment, in this embodiment, determining a collaborative recognition task among a plurality of unmanned boats based on a meta-action decomposition result in the meta-action set includes:
based on meta-action decomposition results in the meta-action set: the method comprises the following steps of navigation radar data processing element action, photoelectric data processing element action, image identification element action, multi-source data fusion element action and target sequencing element action, and determining element action included in a collaborative identification task among a plurality of unmanned boats as: the method comprises the following steps of navigation radar data processing element action, photoelectric data processing element action, image identification element action, multi-source data fusion element action, target sequencing element action and target distribution element action;
the target allocation element action is to allocate the targets to be identified among the unmanned boats according to a preset task strategy.
In this embodiment, the target is allocated as a new meta-action for the collaborative recognition scene of multiple unmanned boats, and the rest are meta-actions obtained by decomposition according to the recognition process of a single boat, so that the flexibility and the expansibility of the scheme can be fully explained.
In the embodiment, a meta-motion set containing a meta-motion decomposition result is obtained by performing meta-motion decomposition on a detection task of a single unmanned ship, and a collaborative identification task among a plurality of unmanned ships is determined based on the meta-motion decomposition result in the meta-motion set, so that the concept of decomposing the meta-motion is realized, an unmanned ship identification process is decomposed into a combination of a plurality of meta-motions, and a meta-motion set based on unmanned ship capability is formed. Each element action has the characteristics of high cohesion and loose coupling, can be independently optimized and continuously promoted, and is used as a basis for designing a multi-unmanned-boat collaborative identification process. The method can flexibly modify the collaborative identification process, adapts to more application scenes by increasing the meta-action, can effectively reduce the difficulty of subsequent continuous optimization debugging, and has good expansibility.
Based on the content of the foregoing embodiment, in this embodiment, the navigation radar data processing meta-action comprises: acquiring initial information of targets around the unmanned boat's route, wherein the initial information comprises speed and direction information;
the photoelectric data processing meta-action comprises: acquiring stable image or video information of a target;
the image recognition meta-action comprises: identifying the target according to the image or video information to obtain the type and color information of the target;
the multi-source data fusion meta-action comprises: correlating and updating information from multiple sources such as the navigation radar, the photoelectric equipment and image recognition;
the target sorting meta-action comprises: sorting the targets by importance according to the information of each target to form a target confirmation list.
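A minimal sketch of what the multi-source fusion step might look like is given below. The patent does not prescribe a concrete association algorithm, so keying the three sources by a shared track identifier, and the dictionary layout of recognition_results, are purely assumptions for illustration.
```python
def fuse_multi_source(radar_tracks, photo_frames, recognition_results):
    """Associate and update information from the navigation radar, the photoelectric
    equipment and image recognition into one record per target, keyed by track id.
    radar_tracks: objects with track_id/speed/heading; photo_frames: {track_id: frames};
    recognition_results: {track_id: {"category": ..., "color": ...}} (assumed shapes)."""
    fused = {}
    for track in radar_tracks:
        fused[track.track_id] = {"speed": track.speed, "heading": track.heading}
    for track_id, frames in photo_frames.items():
        fused.setdefault(track_id, {})["frames"] = frames
    for track_id, result in recognition_results.items():
        fused.setdefault(track_id, {}).update(
            {"category": result["category"], "color": result["color"]})
    return fused
```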
Based on the content of the foregoing embodiment, in this embodiment, the detection task of a single unmanned ship is decomposed in terms of meta-motion according to each stage of the detection process, and the meta-motion decomposition result is obtained as follows: navigation radar data processing element action, photoelectric data processing element action, image identification element action, multi-source data fusion element action and target sequencing element action, including:
controlling the single unmanned boat to execute the following processing procedures according to the element action decomposition result: controlling a single unmanned ship to sail to a task area according to a pre-planned scheme, detecting the periphery according to the pre-planned scheme, acquiring the initial information of a peripheral target based on a navigation radar, acquiring stable image or video information of the target by utilizing photoelectric equipment, identifying the target by utilizing image identification and analysis equipment according to the image or video information, obtaining an identification result containing the type and color information of the target, and forming a target list containing the initial information and the identification result;
after the target list sent by each unmanned ship is obtained, determining the target acquisition list of each unmanned ship according to the target value of the identification result in the target list of each unmanned ship, and sending the target acquisition list of each unmanned ship to the corresponding unmanned ship.
Based on the content of the above embodiment, in the present embodiment, based on the meta-action decomposition result in the meta-action set: the method comprises the following steps of navigation radar data processing element action, photoelectric data processing element action, image identification element action, multi-source data fusion element action and target sequencing element action, and determining element action included in a collaborative identification task among a plurality of unmanned boats as: navigation radar data processing element action, photoelectric data processing element action, image identification element action, multi-source data fusion element action, target sequencing element action and target distribution element action, and the method comprises the following steps:
controlling the single unmanned boat to execute the following processing procedures according to the element action decomposition result: controlling a single unmanned ship to sail to a task area according to a pre-planned scheme, detecting the periphery according to the pre-planned scheme, acquiring the initial information of a peripheral target based on a navigation radar, acquiring stable image or video information of the target by utilizing photoelectric equipment, identifying the target by utilizing image identification and analysis equipment according to the image or video information, obtaining an identification result containing the type and color information of the target, and forming a target list containing the initial information and the identification result;
after a target list sent by each unmanned ship is obtained, a total target acquisition list which needs to be cooperatively acquired by a plurality of unmanned ships is determined according to the target value of the identification result in the target list of each unmanned ship, and the targets which need to be identified in the total target acquisition list are distributed among the unmanned ships according to a preset task strategy.
In this embodiment, the preset task strategy may be a strategy determined according to the degree of match between the target value level of the recognition result and the processing-capability level of the unmanned boat, or a strategy that assigns a target to the unmanned boat that needs only a small heading change and is at a short distance from that target, or a strategy that combines both of the above considerations.
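For illustration only, one possible form of such a strategy combines value/capability matching with heading-change and distance costs. The weights, the boat attributes (capability_level, heading_change_to, distance_to) and the greedy assignment below are all assumptions made for this sketch, not the patent's prescribed method.
```python
def allocate(total_list, boats, w_match=1.0, w_turn=0.5, w_dist=0.5):
    """Greedy sketch of a preset task strategy: assign each target in the total
    acquisition list to the boat with the best combined score."""
    assignments = {boat.boat_id: [] for boat in boats}
    for target in total_list:                      # most valuable targets first
        def score(boat):
            match = -abs(target.value_level - boat.capability_level)  # value vs. capability
            turn = -abs(boat.heading_change_to(target))               # smaller course change
            dist = -boat.distance_to(target)                          # shorter distance
            return w_match * match + w_turn * turn + w_dist * dist
        best = max(boats, key=score)
        assignments[best.boat_id].append(target)
    return assignments
```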
In the embodiment, a meta-motion set containing a meta-motion decomposition result is obtained by performing meta-motion decomposition on a detection task of a single unmanned ship, and a collaborative identification task among a plurality of unmanned ships is determined based on the meta-motion decomposition result in the meta-motion set, so that the concept of decomposing the meta-motion is realized, an unmanned ship identification process is decomposed into a combination of a plurality of meta-motions, and a meta-motion set based on unmanned ship capability is formed. Each element action has the characteristics of high cohesion and loose coupling, can be independently optimized and continuously promoted, and is used as a basis for designing a multi-unmanned-boat collaborative identification process. The method can flexibly modify the collaborative identification process, adapts to more application scenes by increasing the meta-action, can effectively reduce the difficulty of subsequent continuous optimization debugging, and has good expansibility.
As can be seen from the above description, the process involved in this embodiment mainly includes the following main steps:
(1) mission planning
(2) Navigation control
(3) Target confirmation
(4) Task termination
The unmanned-boat meta-actions that can be decomposed from the target confirmation process are as follows.
Navigation radar data processing: acquire initial information on targets around the unmanned boat's route, including information such as speed and direction.
Photoelectric data processing: acquire stable image or video information of the target.
Image recognition: identify the target from the image or video information to obtain information such as its type and color.
Multi-source data fusion: associate and update information from multiple sources such as the navigation radar, the photoelectric equipment and image recognition.
Target sorting: sort the targets by importance according to the information on each target to form a target confirmation list; the sorting can be performed manually by an operator or autonomously according to a task strategy.
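A minimal sketch of the target-sorting meta-action under these two modes is shown below; the importance function is a placeholder, and in the manual mode the operator is assumed to supply an ordering of track ids.
```python
def sort_targets(targets, operator_order=None, importance=lambda t: t.importance):
    """Form the target confirmation list either manually (operator-supplied order of
    track ids) or autonomously (ranked by importance per the task strategy)."""
    if operator_order is not None:
        by_id = {t.track_id: t for t in targets}
        return [by_id[i] for i in operator_order if i in by_id]
    return sorted(targets, key=importance, reverse=True)
```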
Based on the meta-action decomposition result, the main process of the multi-unmanned-boat collaborative recognition task and the interaction among the meta-actions are designed, as shown in fig. 3 and 4.
In the above design process, the meta-actions involved are as follows.
Navigation radar data processing: acquire initial information on targets around the unmanned boat's route, including information such as speed and direction.
Photoelectric data processing: acquire stable image or video information of the target.
Image recognition: identify the target from the image or video information to obtain information such as its type and color.
Multi-source data fusion: associate and update information from multiple sources such as the navigation radar, the photoelectric equipment and image recognition.
Target sorting: sort the targets by importance according to the information on each target to form a target confirmation list; the sorting can be performed manually by an operator or autonomously according to a task strategy.
Target allocation: distribute the targets to be identified among the unmanned boats according to the preset task strategy.
Target allocation is the meta-action newly added for the multi-unmanned-boat collaborative identification scenario, while the rest are meta-actions obtained by decomposing the single-boat identification process; this fully demonstrates the flexibility and extensibility of the scheme.
According to the technical scheme, the multi-unmanned-boat collaborative identification method provided by the embodiment of the invention obtains the meta-action set containing the meta-action decomposition result by performing the meta-action decomposition on the detection task of a single unmanned boat, and determines the collaborative identification tasks among the plurality of unmanned boats based on the meta-action decomposition result in the meta-action set, so that the concept of decomposing the meta-action is realized, the unmanned boat identification process is decomposed into a combination of a plurality of meta-actions, and the meta-action set based on the unmanned boat capability is formed. Each element action has the characteristics of high cohesion and loose coupling, can be independently optimized and continuously promoted, and is used as a basis for designing a multi-unmanned-boat collaborative identification process. The method can flexibly modify the collaborative identification process, adapts to more application scenes by increasing the meta-action, can effectively reduce the difficulty of subsequent continuous optimization debugging, and has good expansibility.
Based on the same inventive concept, another embodiment of the present invention provides a multi-unmanned-boat collaborative recognition apparatus, referring to fig. 5, the multi-unmanned-boat collaborative recognition apparatus provided by this embodiment includes: a meta-action decomposition module 21 and a collaborative recognition module 22, wherein:
the meta-motion decomposition module 21 is used for performing meta-motion decomposition on the detection task of a single unmanned ship to obtain a meta-motion set containing a meta-motion decomposition result; the meta-action set comprises a plurality of meta-actions, and the meta-actions are meta-actions which need to be executed at different stages of the detection task respectively;
and the cooperative identification module 22 is used for determining a cooperative identification task among the unmanned boats based on the meta-action decomposition result in the meta-action set.
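A minimal structural sketch of the apparatus in Python is given below; the module names mirror the description above, while the method names and string identifiers are illustrative assumptions only.
```python
class MetaActionDecompositionModule:
    """Module 21: decompose the single-boat detection task into a meta-action set."""
    def decompose(self, detection_task):
        # One meta-action per stage of the detection flow.
        return ["nav_radar_data_processing", "photoelectric_data_processing",
                "image_recognition", "multi_source_data_fusion", "target_sorting"]

class CollaborativeRecognitionModule:
    """Module 22: determine the multi-boat collaborative identification task."""
    def build_collaborative_task(self, meta_action_set):
        # Reuse the single-boat meta-actions and add target allocation.
        return list(meta_action_set) + ["target_allocation"]

class MultiUnmannedBoatCollaborativeRecognitionDevice:
    def __init__(self):
        self.meta_action_decomposition_module = MetaActionDecompositionModule()
        self.collaborative_recognition_module = CollaborativeRecognitionModule()
```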
Based on the content of the foregoing embodiment, in this embodiment, the meta-action decomposition module 21 is specifically configured to:
performing element action decomposition on the detection task of a single unmanned ship according to each stage of the detection process to obtain an element action decomposition result as follows: the method comprises the following steps of navigation radar data processing element action, photoelectric data processing element action, image identification element action, multi-source data fusion element action and target sequencing element action;
and taking the navigation radar data processing meta-motion, the photoelectric data processing meta-motion, the image recognition meta-motion, the multi-source data fusion meta-motion and the target sorting meta-motion in the meta-motion decomposition result as a meta-motion set.
Based on the content of the foregoing embodiment, in this embodiment, the cooperative identification module 22 is specifically configured to:
based on meta-action decomposition results in the meta-action set: the method comprises the following steps of navigation radar data processing element action, photoelectric data processing element action, image identification element action, multi-source data fusion element action and target sequencing element action, and determining element action included in a collaborative identification task among a plurality of unmanned boats as: the method comprises the following steps of navigation radar data processing element action, photoelectric data processing element action, image identification element action, multi-source data fusion element action, target sequencing element action and target distribution element action;
the target allocation element action is to allocate the targets to be identified among the unmanned boats according to a preset task strategy.
Based on the content of the foregoing embodiment, in this embodiment, the navigation radar data processing meta-action includes: acquiring initial information of targets around an unmanned ship route, wherein the initial information comprises speed and direction information;
the optoelectronic data processing element acts include: acquiring stable image or video information of a target;
the image recognition meta-action comprises: and identifying the target according to the image or video information to obtain the type and color information of the target.
The multi-source data fusion meta-action comprises: information of various sources such as navigation radar, photoelectricity, image recognition and the like is correlated and updated;
the target ordering meta-action includes: and sorting the importance degrees of the targets according to the information of each target to form a target confirmation list.
Based on the content of the foregoing embodiment, in this embodiment, the meta-motion decomposition module 21 performs meta-motion decomposition on the detection task of a single unmanned ship according to each stage of the detection process, and obtains a meta-motion decomposition result as: when the navigation radar data processing element acts, the photoelectric data processing element acts, the image recognition element acts, the multi-source data fusion element acts and the target sequencing element acts, the method is specifically used for:
controlling the single unmanned boat to execute the following processing procedures according to the element action decomposition result: controlling a single unmanned ship to sail to a task area according to a pre-planned scheme, detecting the periphery according to the pre-planned scheme, acquiring the initial information of a peripheral target based on a navigation radar, acquiring stable image or video information of the target by utilizing photoelectric equipment, identifying the target by utilizing image identification and analysis equipment according to the image or video information, obtaining an identification result containing the type and color information of the target, and forming a target list containing the initial information and the identification result;
after the target list sent by each unmanned ship is obtained, determining the target acquisition list of each unmanned ship according to the target value of the identification result in the target list of each unmanned ship, and sending the target acquisition list of each unmanned ship to the corresponding unmanned ship.
Based on the content of the foregoing embodiment, in this embodiment, the cooperation identifying module 22, based on the meta-action decomposition result in the meta-action set: the method comprises the following steps of navigation radar data processing element action, photoelectric data processing element action, image identification element action, multi-source data fusion element action and target sequencing element action, and determining element action included in a collaborative identification task among a plurality of unmanned boats as: the navigation radar data processing element action, the photoelectric data processing element action, the image identification element action, the multi-source data fusion element action, the target sequencing element action and the target distribution element action are specifically used for:
controlling the single unmanned boat to execute the following processing procedures according to the element action decomposition result: controlling a single unmanned ship to sail to a task area according to a pre-planned scheme, detecting the periphery according to the pre-planned scheme, acquiring the initial information of a peripheral target based on a navigation radar, acquiring stable image or video information of the target by utilizing photoelectric equipment, identifying the target by utilizing image identification and analysis equipment according to the image or video information, obtaining an identification result containing the type and color information of the target, and forming a target list containing the initial information and the identification result;
after a target list sent by each unmanned ship is obtained, a total target acquisition list which needs to be cooperatively acquired by a plurality of unmanned ships is determined according to the target value of the identification result in the target list of each unmanned ship, and the targets which need to be identified in the total target acquisition list are distributed among the unmanned ships according to a preset task strategy.
Since the multi-unmanned-vessel cooperative identification device provided by the embodiment can be used for executing the multi-unmanned-vessel cooperative identification method described in the above embodiment, and the working principle and the beneficial effect are similar, detailed descriptions are omitted here, and specific contents can be referred to the description of the above embodiment.
Based on the same inventive concept, another embodiment of the present invention provides an electronic device, which specifically includes the following components, with reference to fig. 6: a processor 301, a memory 302, a communication interface 303, and a communication bus 304;
the processor 301, the memory 302 and the communication interface 303 complete mutual communication through the communication bus 304; the communication interface 303 is used for realizing transmission between related devices such as modeling software, an intelligent manufacturing equipment module library and the like;
the processor 301 is configured to call a computer program in the memory 302, and the processor implements all the steps of the above-mentioned multi-unmanned-boat collaborative recognition method when executing the computer program, for example, the processor implements the following steps when executing the computer program: performing element action decomposition on the detection task of a single unmanned ship to obtain an element action set containing element action decomposition results; the meta-action set comprises a plurality of meta-actions, and the meta-actions are meta-actions which need to be executed at different stages of the detection task respectively; and determining a collaborative identification task among the unmanned boats based on the meta-action decomposition result in the meta-action set.
Based on the same inventive concept, a further embodiment of the present invention provides a non-transitory computer-readable storage medium, having a computer program stored thereon, which when executed by a processor implements all the steps of the above-mentioned multi-unmanned boat collaborative recognition method, for example, the processor implements the following steps when executing the computer program: performing element action decomposition on the detection task of a single unmanned ship to obtain an element action set containing element action decomposition results; the meta-action set comprises a plurality of meta-actions, and the meta-actions are meta-actions which need to be executed at different stages of the detection task respectively; and determining a collaborative identification task among the unmanned boats based on the meta-action decomposition result in the meta-action set.
In addition, the logic instructions in the memory may be implemented in the form of software functional units and may be stored in a computer readable storage medium when sold or used as a stand-alone product. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the embodiment of the present invention. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware. Based on such understanding, the technical solutions mentioned above may be embodied in the form of a software product, which may be stored in a computer-readable storage medium, such as ROM/RAM, magnetic disk, optical disk, etc., and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the multi-unmanned-boat collaborative identification method according to the various embodiments or some parts of the embodiments.
In the present invention, terms such as "first", "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present invention, "a plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
Moreover, in the present invention, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
Furthermore, in the description herein, reference to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
The above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (10)

1. A multi-unmanned-boat collaborative identification method is characterized by comprising the following steps:
performing element action decomposition on the detection task of a single unmanned ship to obtain an element action set containing element action decomposition results; the meta-action set comprises a plurality of meta-actions, and the meta-actions are meta-actions which need to be executed at different stages of the detection task respectively;
and determining a collaborative identification task among the unmanned boats based on the meta-action decomposition result in the meta-action set.
2. The method for cooperatively identifying multiple unmanned boats according to claim 1, wherein performing meta-motion decomposition on a detection task of a single unmanned boat to obtain a meta-motion set including a meta-motion decomposition result comprises:
performing element action decomposition on the detection task of a single unmanned ship according to each stage of the detection process to obtain an element action decomposition result as follows: the method comprises the following steps of navigation radar data processing element action, photoelectric data processing element action, image identification element action, multi-source data fusion element action and target sequencing element action;
and taking the navigation radar data processing meta-motion, the photoelectric data processing meta-motion, the image recognition meta-motion, the multi-source data fusion meta-motion and the target sorting meta-motion in the meta-motion decomposition result as a meta-motion set.
3. The multi-unmanned-boat collaborative recognition method according to claim 2, wherein determining collaborative recognition tasks among a plurality of unmanned boats based on meta-action decomposition results in the meta-action set comprises:
based on meta-action decomposition results in the meta-action set: the method comprises the following steps of navigation radar data processing element action, photoelectric data processing element action, image identification element action, multi-source data fusion element action and target sequencing element action, and determining element action included in a collaborative identification task among a plurality of unmanned boats as: the method comprises the following steps of navigation radar data processing element action, photoelectric data processing element action, image identification element action, multi-source data fusion element action, target sequencing element action and target distribution element action;
the target allocation element action is to allocate the targets to be identified among the unmanned boats according to a preset task strategy.
4. The multi-unmanned-boat collaborative identification method according to claim 3, wherein the navigation radar data processing meta-action comprises: acquiring initial information of targets around an unmanned ship route, wherein the initial information comprises speed and direction information;
the optoelectronic data processing element acts include: acquiring stable image or video information of a target;
the image recognition meta-action comprises: and identifying the target according to the image or video information to obtain the type and color information of the target.
The multi-source data fusion meta-action comprises: information of various sources such as navigation radar, photoelectricity, image recognition and the like is correlated and updated;
the target ordering meta-action includes: and sorting the importance degrees of the targets according to the information of each target to form a target confirmation list.
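Purely as an illustration of the data flow described in claim 4, the sketch below uses a hypothetical TargetInfo record together with fuse and target_sorting helpers; none of these names or field choices come from the patent.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class TargetInfo:
    """Fused record of one target around the unmanned boat's route."""
    target_id: int
    speed: float                          # from the navigation radar data processing meta-action
    direction: float
    frames: List[bytes] = field(default_factory=list)  # from the photoelectric data processing meta-action
    target_type: Optional[str] = None     # from the image recognition meta-action
    color: Optional[str] = None
    importance: float = 0.0               # used by the target sorting meta-action

def fuse(track: TargetInfo, target_type: str, color: str) -> TargetInfo:
    """Multi-source data fusion meta-action: associate and update radar, photoelectric
    and image recognition information belonging to the same target."""
    track.target_type = target_type
    track.color = color
    return track

def target_sorting(targets: List[TargetInfo]) -> List[TargetInfo]:
    """Target sorting meta-action: rank targets by importance to form a target confirmation list."""
    return sorted(targets, key=lambda t: t.importance, reverse=True)
```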
5. The multi-unmanned-boat collaborative identification method according to claim 4, wherein performing meta-action decomposition on the detection task of a single unmanned boat according to the stages of the detection process to obtain the meta-action decomposition result, namely the navigation radar data processing meta-action, the photoelectric data processing meta-action, the image recognition meta-action, the multi-source data fusion meta-action and the target sorting meta-action, comprises:
controlling the single unmanned boat to execute the following processing procedure according to the meta-action decomposition result: controlling the single unmanned boat to sail to a task area according to a pre-planned scheme, detecting its surroundings according to the pre-planned scheme, acquiring the initial information of surrounding targets based on the navigation radar, acquiring stable image or video information of a target by using the photoelectric equipment, recognizing the target according to the image or video information by using image recognition and analysis equipment to obtain a recognition result containing the type and color information of the target, and forming a target list containing the initial information and the recognition result; and
after the target list sent by each unmanned boat is obtained, determining a target acquisition list for each unmanned boat according to the target values of the recognition results in the target list of each unmanned boat, and sending the target acquisition list of each unmanned boat to the corresponding unmanned boat.
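The last step of claim 5 (deriving per-boat target acquisition lists from the reported target lists and their target values) might look like the following sketch; the top_k cut-off is an assumed stand-in for whatever value criterion is actually applied, and the function name is hypothetical.

```python
from typing import Dict, List, Tuple

def build_acquisition_lists(
    target_lists: Dict[str, List[Tuple[int, float]]],  # boat_id -> [(target_id, target_value), ...]
    top_k: int = 3,
) -> Dict[str, List[int]]:
    """For each unmanned boat, keep the top_k targets with the highest target value from its
    reported target list; each resulting acquisition list is then sent back to that boat."""
    acquisition: Dict[str, List[int]] = {}
    for boat_id, targets in target_lists.items():
        ranked = sorted(targets, key=lambda item: item[1], reverse=True)
        acquisition[boat_id] = [target_id for target_id, _value in ranked[:top_k]]
    return acquisition

# Hypothetical usage
reports = {"usv_1": [(101, 0.9), (102, 0.4)], "usv_2": [(201, 0.7), (101, 0.8)]}
print(build_acquisition_lists(reports, top_k=1))  # {'usv_1': [101], 'usv_2': [101]}
```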
6. The multi-unmanned-boat collaborative identification method according to claim 4, wherein, based on the meta-action decomposition result in the meta-action set, namely the navigation radar data processing meta-action, the photoelectric data processing meta-action, the image recognition meta-action, the multi-source data fusion meta-action and the target sorting meta-action, determining the meta-actions included in the collaborative identification task among the plurality of unmanned boats as the navigation radar data processing meta-action, the photoelectric data processing meta-action, the image recognition meta-action, the multi-source data fusion meta-action, the target sorting meta-action and the target allocation meta-action comprises:
controlling the single unmanned boat to execute the following processing procedure according to the meta-action decomposition result: controlling the single unmanned boat to sail to a task area according to a pre-planned scheme, detecting its surroundings according to the pre-planned scheme, acquiring the initial information of surrounding targets based on the navigation radar, acquiring stable image or video information of a target by using the photoelectric equipment, recognizing the target according to the image or video information by using image recognition and analysis equipment to obtain a recognition result containing the type and color information of the target, and forming a target list containing the initial information and the recognition result; and
after the target list sent by each unmanned boat is obtained, determining a total target acquisition list that needs to be cooperatively acquired by the plurality of unmanned boats according to the target values of the recognition results in the target list of each unmanned boat, and allocating the targets to be identified in the total target acquisition list among the plurality of unmanned boats according to the preset task strategy.
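For the allocation step of claim 6, the patent only states that targets in the total acquisition list are distributed according to a preset task strategy; the sketch below substitutes a simple value-threshold merge plus round-robin allocation as one possible placeholder strategy, with hypothetical names throughout.

```python
from typing import Dict, List, Tuple

def allocate_targets(
    target_lists: Dict[str, List[Tuple[int, float]]],  # boat_id -> [(target_id, target_value), ...]
    value_threshold: float = 0.5,
) -> Dict[str, List[int]]:
    """Merge the boats' target lists into a total target acquisition list (targets whose value
    reaches the threshold), then spread those targets across the boats round-robin as a
    placeholder for the preset task strategy."""
    merged: Dict[int, float] = {}
    for targets in target_lists.values():
        for target_id, value in targets:
            merged[target_id] = max(value, merged.get(target_id, 0.0))
    total_list = [tid for tid, val in sorted(merged.items(), key=lambda kv: kv[1], reverse=True)
                  if val >= value_threshold]
    boats = list(target_lists.keys())
    allocation: Dict[str, List[int]] = {boat: [] for boat in boats}
    for index, target_id in enumerate(total_list):
        allocation[boats[index % len(boats)]].append(target_id)  # round-robin placeholder
    return allocation
```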
7. A multi-unmanned-boat collaborative identification device, characterized by comprising:
a meta-action decomposition module, configured to perform meta-action decomposition on the detection task of a single unmanned boat to obtain a meta-action set containing the meta-action decomposition result, wherein the meta-action set comprises a plurality of meta-actions, and the meta-actions are the meta-actions that need to be executed at the respective stages of the detection task; and
a collaborative identification module, configured to determine a collaborative identification task among a plurality of unmanned boats based on the meta-action decomposition result in the meta-action set.
8. The multi-unmanned-boat collaborative identification device according to claim 7, wherein the meta-action decomposition module is specifically configured to:
perform meta-action decomposition on the detection task of a single unmanned boat according to the stages of the detection process, the obtained meta-action decomposition result being: a navigation radar data processing meta-action, a photoelectric data processing meta-action, an image recognition meta-action, a multi-source data fusion meta-action and a target sorting meta-action; and
take the navigation radar data processing meta-action, the photoelectric data processing meta-action, the image recognition meta-action, the multi-source data fusion meta-action and the target sorting meta-action in the meta-action decomposition result as the meta-action set.
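As a rough, non-authoritative sketch of the device structure in claims 7 and 8, the two modules could be wired together as plain classes; all class names below are hypothetical.

```python
class MetaActionDecompositionModule:
    """Performs meta-action decomposition of a single unmanned boat's detection task."""
    STAGES = [
        "navigation_radar_data_processing",
        "photoelectric_data_processing",
        "image_recognition",
        "multi_source_data_fusion",
        "target_sorting",
    ]

    def decompose(self) -> list:
        """Return the meta-action set containing the decomposition result."""
        return list(self.STAGES)


class CollaborativeIdentificationModule:
    """Determines the collaborative identification task among multiple unmanned boats."""

    def build_task(self, meta_action_set: list) -> list:
        # The cooperative task reuses the single-boat meta-actions and adds target allocation.
        return meta_action_set + ["target_allocation"]


class MultiUsvCollaborativeIdentificationDevice:
    """Device composed of the two modules described in claims 7 and 8."""

    def __init__(self) -> None:
        self.decomposition_module = MetaActionDecompositionModule()
        self.identification_module = CollaborativeIdentificationModule()

    def run(self) -> list:
        return self.identification_module.build_task(self.decomposition_module.decompose())
```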
9. An electronic device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the program, implements the steps of the multi-unmanned-boat collaborative identification method according to any one of claims 1 to 6.
10. A non-transitory computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the steps of the multi-unmanned-boat collaborative identification method according to any one of claims 1 to 6.
CN202111297322.3A 2021-11-04 2021-11-04 Multi-unmanned-boat collaborative identification method and device Pending CN114241349A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111297322.3A CN114241349A (en) 2021-11-04 2021-11-04 Multi-unmanned-boat collaborative identification method and device

Publications (1)

Publication Number Publication Date
CN114241349A 2022-03-25

Family

ID=80743672

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111297322.3A Pending CN114241349A (en) 2021-11-04 2021-11-04 Multi-unmanned-boat collaborative identification method and device

Country Status (1)

Country Link
CN (1) CN114241349A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106933232A (en) * 2017-04-27 2017-07-07 上海大学 A kind of context aware systems and method based on collaboration unmanned boat group
CN110097212A (en) * 2019-04-08 2019-08-06 华南理工大学 A kind of unmanned boat high energy efficiency Cooperative Area detection method
CN111399533A (en) * 2020-02-10 2020-07-10 合肥工业大学 Heterogeneous multi-unmanned aerial vehicle cooperative task allocation and path optimization method
CN112182977A (en) * 2020-10-12 2021-01-05 中国人民解放军国防科技大学 Control method and system for cooperative game confrontation of unmanned cluster
CN112270488A (en) * 2020-11-09 2021-01-26 中国电子技术标准化研究院 Unmanned aerial vehicle cluster task allocation method and device and unmanned aerial vehicle cluster system
CN112987737A (en) * 2021-02-26 2021-06-18 华中科技大学 Bi-RRT unmanned ship multi-ship navigation method and equipment considering corner constraint
CN113316118A (en) * 2021-05-31 2021-08-27 中国人民解放军国防科技大学 Unmanned aerial vehicle cluster network self-organizing system and method based on task cognition

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
李远 et al.: "基于BDI智能体的无人作战飞机自主任务管理系统" (Autonomous task management system for unmanned combat aircraft based on BDI agents), 《系统仿真学报》 (Journal of System Simulation) *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117590752A (en) * 2024-01-16 2024-02-23 深圳市太控科技有限公司 Multi-axis cooperative control method and system based on action decomposition
CN117590752B (en) * 2024-01-16 2024-04-26 深圳市太控科技有限公司 Multi-axis cooperative control method and system based on action decomposition


Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information
     Address after: 1 Fengxian East Road, Haidian District, Beijing 100094
     Applicant after: China Shipbuilding Corporation System Engineering Research Institute
     Address before: 1 Fengxian East Road, Haidian District, Beijing 100094
     Applicant before: China Shipbuilding Industry System Engineering Research Institute
RJ01 Rejection of invention patent application after publication (application publication date: 20220325)