CN116339220A - Method, device and system for cooperatively completing tasks by different unmanned systems - Google Patents

Method, device and system for cooperatively completing tasks by different unmanned systems

Info

Publication number
CN116339220A
Authority
CN
China
Prior art keywords: unmanned, unmanned systems, task, different, systems
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202310617786.0A
Other languages
Chinese (zh)
Other versions
CN116339220B (en)
Inventor
徐东方
王亚楠
李瑞民
Current Assignee
Beijing Zhongbing Tiangong Defense Technology Co ltd
Original Assignee
Beijing Zhongbing Tiangong Defense Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Zhongbing Tiangong Defense Technology Co ltd filed Critical Beijing Zhongbing Tiangong Defense Technology Co ltd
Priority to CN202310617786.0A
Publication of CN116339220A
Application granted
Publication of CN116339220B
Legal status: Active

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00: Programme-control systems
    • G05B19/02: Programme-control systems electric
    • G05B19/04: Programme control other than numerical control, i.e. in sequence controllers or logic controllers
    • G05B19/042: Programme control other than numerical control, using digital processors
    • G05B19/0423: Input/output
    • G05B2219/00: Program-control systems
    • G05B2219/20: Pc systems
    • G05B2219/25: Pc structure of the system
    • G05B2219/25257: Microcontroller
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The application discloses a method, a device and a system for cooperatively completing tasks with different unmanned systems, relating to the technical field of unmanned systems. Platform information and sensor information sent by a plurality of different unmanned systems are received; a global situation is generated from the platform information and the sensor information; an event is generated from the global situation according to a preset event-generation rule; task instructions for each unmanned system are generated from the event, and each task instruction is sent to the corresponding unmanned system. After receiving a task instruction, the unmanned system responds to it and completes the task. The task instructions comprise global-decision task instructions and single-machine-decision task instructions. The method, device and system can complete more complex tasks and are applicable to more scenarios.

Description

Method, device and system for cooperatively completing tasks by different unmanned systems
Technical Field
The application relates to the technical field of unmanned systems, in particular to a method, a device and a system for cooperatively completing tasks by different unmanned systems.
Background
With the continuous development of technology, the unmanned-system market has flourished. Manufacturers offer unmanned systems with a wide range of functions, including unmanned aerial vehicles (UAVs), unmanned surface vessels and unmanned submarines. These systems generally have mature remote-control functions (a person controls the unmanned system through a remote controller) and a certain degree of program control (the unmanned system is controlled automatically by a program), so they can complete specific tasks independently under remote or program control. For example, a UAV can perform aerial monitoring, telemetry and line patrol, and can enter disaster sites such as earthquakes or fires to carry out personnel rescue.
Because of the limits of an unmanned system's endurance and payload, and the constraints of specific task scenarios, a single unmanned system is inadequate in many task scenarios. Some tasks cannot be completed even by several unmanned systems of the same type; unmanned systems of different types must cooperate with each other to meet the task requirements.
However, because the operating principles of different unmanned systems differ greatly and they serve different industries, few unmanned-system companies can develop several different unmanned systems at the same time and make them work together, and developing a new unmanned system is costly. How to make the unmanned systems already developed by different manufacturers cooperate effectively, so that unmanned systems of different types can work together to meet task requirements, has therefore become a problem to be solved by those skilled in the art.
Disclosure of Invention
Therefore, the application provides a method, a device and a system for cooperatively completing tasks with different unmanned systems, to solve the prior-art problem that unmanned systems of different types cannot cooperate to complete tasks together.
In order to achieve the above object, the present application provides the following technical solutions:
in a first aspect, a method for cooperatively completing tasks with different unmanned systems includes:
receiving platform information and sensor information sent by a plurality of different unmanned systems, and generating a global situation according to the platform information and the sensor information;
generating an event according to the global situation and a preset event generation rule;
generating task instructions of each unmanned system according to the events, and sending each task instruction to the corresponding unmanned system; the unmanned system responds to the task instruction and completes the task after receiving the task instruction; the task instructions comprise global decision task instructions and single-machine decision task instructions.
Preferably, a global situation is generated according to the platform information and the sensor information, specifically:
carrying out filtering processing on the platform information and the sensor information by adopting a Kalman filtering algorithm;
and carrying out time alignment on the platform information and the sensor information after the filtering processing, and fusing the platform information and the sensor information by adopting a k-means algorithm and a Bayesian network to obtain a global situation.
Preferably, the global decision task instruction is generated according to a system task target and the global situation; the single-machine decision task instruction is generated according to the global decision task instruction and the self state of the unmanned system.
Preferably, the event includes an event type, a generation time, an event coordinate, and an event object.
In a second aspect, an apparatus for co-completing tasks in conjunction with different unmanned systems, comprises:
the situation information processing module is used for receiving platform information and sensor information sent by a plurality of different unmanned systems and generating a global situation according to the platform information and the sensor information;
the event generation module is used for generating an event according to the global situation and a preset event generation rule;
the unmanned system state machine module is used for generating task instructions of all unmanned systems according to the events, sending all the task instructions to corresponding unmanned systems, and responding to the task instructions and completing tasks after the unmanned systems receive the task instructions; the task instructions comprise global decision task instructions and single-machine decision task instructions.
Preferably, a plurality of unmanned system state machine modules are provided, and each unmanned system state machine module corresponds to one unmanned system.
In a third aspect, a system for co-completing tasks with different unmanned systems includes the device for co-completing tasks with different unmanned systems, a first communication device and a plurality of unmanned systems, where the first communication device is electrically connected with the device for co-completing tasks with different unmanned systems, and each unmanned system is configured with a separate second communication device for establishing communication connection with the device for co-completing tasks with different unmanned systems via the first communication device, respectively.
In a fourth aspect, a system for co-operating with different unmanned systems to complete tasks includes a device for co-operating with different unmanned systems to complete tasks, a network switch, and a plurality of unmanned systems, where the device for co-operating with different unmanned systems to complete tasks is communicatively connected to a ground control station of the unmanned systems through the network switch.
In a fifth aspect, a computer device comprises a memory storing a computer program and a processor that implements the steps of the method for cooperatively completing tasks with different unmanned systems when executing the computer program.
In a sixth aspect, a computer readable storage medium has stored thereon a computer program which, when executed by a processor, performs the steps of a method for co-completing tasks in conjunction with different unmanned systems.
Compared with the prior art, the application has the following beneficial effects:
the application provides a method, a device and a system for cooperatively completing tasks with different unmanned systems. Platform information and sensor information sent by a plurality of different unmanned systems are received; a global situation is generated from the platform information and the sensor information; an event is generated from the global situation according to a preset event-generation rule; task instructions for each unmanned system are generated from the event, and each task instruction is sent to the corresponding unmanned system. After receiving a task instruction, the unmanned system responds to it and completes the task. The task instructions comprise global-decision task instructions and single-machine-decision task instructions. The method, device and system can complete more complex tasks and are applicable to more scenarios; they make full use of the currently abundant unmanned-system platforms and improve effectiveness at the current level of technology.
Drawings
For a more intuitive illustration of the prior art and the present application, several exemplary drawings are presented below. It should be understood that the specific shapes and configurations shown in the drawings are not, in general, limiting on the practice of the present application; for example, based on the technical concepts and exemplary drawings disclosed herein, those skilled in the art can easily make conventional adjustments or further optimizations to the addition, removal or subdivision of certain units (components), or to their specific shapes, positional relationships, connection modes and dimensional proportions.
FIG. 1 is a flowchart of a method for co-completing tasks in conjunction with different unmanned systems according to a first embodiment of the present application;
fig. 2 is a schematic structural diagram of a device for co-completing tasks in cooperation with different unmanned systems according to a second embodiment of the present application;
fig. 3 is a schematic diagram of a system structure for co-completing tasks in cooperation with different unmanned systems according to a third embodiment of the present application;
fig. 4 is a schematic diagram of a system structure for cooperatively completing tasks in conjunction with different unmanned systems according to a fourth embodiment of the present application.
Detailed Description
The present application is further described in detail below with reference to the attached drawings.
In the description of the present application: unless otherwise indicated, the meaning of "a plurality" is two or more. The terms "first," "second," "third," and the like in this application are intended to distinguish between the referenced objects without a special meaning in terms of technical connotation (e.g., should not be construed as emphasis on degree or order of importance, etc.). The expressions "comprising", "including", "having", etc. also mean "not limited to" (certain units, components, materials, steps, etc.).
The terms such as "upper", "lower", "left", "right", "middle", and the like, as referred to in this application, are generally used for convenience in visual understanding with reference to the drawings, and are not intended to be an absolute limitation of the positional relationship in actual products. Such changes in relative positional relationship are considered to be within the scope of the present description without departing from the technical concepts disclosed herein.
Example one
Referring to fig. 1, the present embodiment provides a method for co-completing tasks in conjunction with different unmanned systems, including:
s1: receiving platform information and sensor information sent by a plurality of different unmanned systems, and generating a global situation according to the platform information and the sensor information;
specifically, the global situation is that a Kalman filtering algorithm is adopted to carry out filtering processing on platform information and sensor information sent by a plurality of unmanned systems, then time alignment is carried out on the platform information and the sensor information provided by each unmanned system, and a k-means algorithm and a Bayesian network are adopted to carry out data fusion. The k-means algorithm clusters similar data to form fusion data, the Bayesian network is trained based on historical data, and confidence of the fusion data is evaluated, so that fusion accuracy is improved.
For example, in a maritime joint-patrol scenario, platform information such as the positions, speeds and remaining fuel of the UAV and the unmanned vessel, together with sensor observations such as pod video streams, target-identification data and radar-identification data, is received, aggregated and fused to generate a unified global situation containing all unmanned systems and their sensor observations. The global situation mainly consists of the various vessels on the water surface, including their types, positions, speeds and intentions.
In a real environment, when unmanned systems dispersed across different locations observe the environment through their own payloads, observation errors are difficult to avoid, and a consistent understanding of the environment is difficult to achieve in a distributed system.
For example, in the maritime joint-patrol scenario, a UAV and an unmanned vessel jointly carry out a patrol task at sea, monitoring activities such as fishing and smuggling. The UAV detects water-surface targets from the air through its electro-optical pod, while the unmanned vessel detects them through electro-optical sensors and radar. Because of spray, reflections and other characteristics of the water surface, the UAV and the unmanned vessel may report a large number of suspected targets, leading to large errors in target judgement. By fusing the features and spatio-temporal characteristics of the targets found by the UAV and the unmanned vessel, errors can be effectively removed and the reliability of the global situation improved.
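The de-duplication described here can be sketched with a plain k-means pass over the detection coordinates: duplicate reports of the same vessel from the UAV and the unmanned vessel collapse into one fused track per cluster. The coordinates and the choice of k below are illustrative assumptions.

```python
def kmeans(points, k, iters=20):
    """Cluster 2-D detections; return the k fused cluster centres."""
    centroids = list(points[:k])            # deterministic init for the sketch
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for px, py in points:
            # Assign each detection to its nearest centroid.
            i = min(range(k),
                    key=lambda c: (px - centroids[c][0]) ** 2
                                + (py - centroids[c][1]) ** 2)
            clusters[i].append((px, py))
        # Recompute centroids; empty clusters keep their old centre.
        centroids = [
            (sum(x for x, _ in c) / len(c), sum(y for _, y in c) / len(c))
            if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids

# Six detections of two real vessels (three reports each) fuse to two tracks.
detections = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1),
              (5.0, 5.0), (5.1, 5.0), (5.0, 5.1)]
fused = sorted(kmeans(detections, 2))
```

A production system would also weight reports by sensor confidence (the Bayesian-network step in the text), which this sketch omits.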
S2: generating an event according to the global situation and a preset event-generation rule. The event elements mainly comprise the event type, generation time, event coordinates and event object; events are the key basis for decision-making based on the state-machine technique.
To reduce the operator's workload, so that one operator can oversee the work of several unmanned systems at the same time and act only as a decision maker rather than a controller, event generation is added: in normal circumstances, the operator only needs to pay attention to events, not to all data reported by the unmanned systems. Note that the operator acts as a monitor and final decision maker: the operator can confirm or reject a decision reported by an unmanned system, thereby influencing the decision process.
An event is a noteworthy occurrence produced by processing the observation data. For example, by processing the large number of targets observed and reported by UAVs and unmanned vessels, a fishing vessel likely to be operating illegally may be found; people should then be alerted to this vessel, so an event is generated to notify the system's users.
This step is further illustrated by the maritime joint-patrol scenario: there, an event mainly corresponds to finding and identifying a specific target. For example, by jointly analysing a vessel's size and the characteristics of its deck equipment, the target may be judged with high probability to be a fishing vessel; if the mission sea area is still in the closed fishing season, an "illegal fishing vessel found" event is generated.
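A rule of this kind can be expressed as a simple predicate over the fused global situation; the sketch below generates an "illegal fishing vessel" event as in the example. All field names (`type`, `in_mission_area`, and so on) and the closed-season dates are assumptions made for illustration.

```python
from datetime import datetime

def generate_events(global_situation, closed_season, now):
    """Apply one preset event-generation rule to the fused situation."""
    start, end = closed_season
    events = []
    for target in global_situation:
        if (target["type"] == "fishing_vessel"
                and target["in_mission_area"]
                and start <= now <= end):
            events.append({
                "event_type": "illegal_fishing_vessel",
                "generated_at": now,                # generation time
                "coordinates": target["position"],  # event coordinates
                "event_object": target["id"],       # event object
            })
    return events

situation = [
    {"id": "T1", "type": "fishing_vessel", "in_mission_area": True,
     "position": (30.1, 122.5)},
    {"id": "T2", "type": "cargo_vessel", "in_mission_area": True,
     "position": (30.2, 122.6)},
]
found = generate_events(situation,
                        (datetime(2023, 5, 1), datetime(2023, 8, 1)),
                        now=datetime(2023, 6, 15))
```

Note that each generated event carries the four elements listed in the text: type, generation time, coordinates and object.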
In the maritime joint-patrol scenario, the data reported by the unmanned systems mainly comprise their own state data (such as position, speed and fuel) and observation data (such as pod video streams, target-identification data and radar-identification data). Decision-making means deciding what to do on the basis of the reported data, for example deciding to send the UAV to take close-range photographs of a vessel suspected of illegal fishing, for evidence collection. Control means an operator directly remote-controls the unmanned system, for example flying the aircraft close to the fishing vessel with the controller and taking pictures.
S3: generating task instructions for each unmanned system according to the event, and sending each task instruction to the corresponding unmanned system. After receiving a task instruction, the unmanned system responds to it and completes the task. The task instructions comprise global-decision task instructions and single-machine-decision task instructions.
A task instruction is the smallest instruction unit that an unmanned system can execute autonomously, for example "photograph XX with the camera for evidence"; the UAV can then perform actions such as approaching and photographing according to the instruction.
The global-decision task instruction is generated from the system task goal and the global situation. For example, if the system task is to patrol a certain area, the global-decision module generates patrol task areas according to the current patrol coverage and assigns several unmanned systems to patrol them within a certain time range. The single-machine-decision task instruction is generated from the global-decision task instruction and the platform's own state: for example, on receiving a patrol task assigned by the global decision, the platform generates its own instructions according to its payload capability, motion characteristics and so on.
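The two decision layers just described can be sketched as two small functions: a global decision that assigns patrol sub-areas from the shared situation, and a single-machine decision that converts the assignment into an instruction matched to the platform's own state. The round-robin assignment, field names and fuel threshold are illustrative assumptions, not details from the application.

```python
def global_decision(patrol_areas, platform_ids):
    """Global-decision layer: assign each platform a patrol sub-area."""
    return {pid: patrol_areas[i % len(patrol_areas)]
            for i, pid in enumerate(platform_ids)}

def single_machine_decision(platform_state, assigned_area, fuel_reserve=20.0):
    """Single-machine layer: adapt the global assignment to own state."""
    if platform_state["fuel"] < fuel_reserve:    # cannot safely patrol
        return {"instruction": "return_to_base"}
    return {"instruction": "patrol", "area": assigned_area,
            "speed": platform_state["cruise_speed"]}

assignments = global_decision(["area-A", "area-B"], ["uav-1", "usv-1"])
uav_cmd = single_machine_decision({"fuel": 80.0, "cruise_speed": 12.0},
                                  assignments["uav-1"])
usv_cmd = single_machine_decision({"fuel": 5.0, "cruise_speed": 8.0},
                                  assignments["usv-1"])
```

The split mirrors the text: the global layer sees only the shared situation, while only the single-machine layer reads platform-local state such as fuel.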
At present, an unmanned system generally has only limited program-control capability and can autonomously execute only simple tasks. This step improves the program-control capability, and thus the degree of autonomy, without changing the unmanned system's software structure.
The method for cooperatively completing tasks with different unmanned systems can complete more complex tasks and is applicable to more scenarios. Compared with developing a new unmanned system, it has clear cost and time advantages, and it makes full use of the currently abundant unmanned-system platforms in China to improve effectiveness at the current level of technology.
Example two
Referring to fig. 2, the present embodiment provides an apparatus for co-completing tasks in conjunction with different unmanned systems (abbreviated as a cooperative control apparatus), which includes:
the situation information processing module is used for receiving platform information and sensor information sent by a plurality of different unmanned systems and generating a global situation according to the platform information and the sensor information;
specifically, the situation information processing module is used for collecting platform information and sensor information of each unmanned system, and summarizing and fusing data to form a unified global situation containing observation data of each unmanned system and sensors thereof.
In a real environment, when unmanned systems dispersed across different locations observe the environment through their own payloads, observation errors are difficult to avoid, and a consistent understanding of the environment is difficult to achieve in a distributed system. The situation information processing module is therefore introduced: through centralized management and fusion at a central node, situational consistency across the systems is achieved.
The event generation module is used for generating events according to the global situation and a preset event generation rule;
specifically, the event generation module can customize event generation rules according to different tasks, and generates events according to global situation and task rules in the process of executing tasks, wherein event elements mainly comprise event types, generation time, event coordinates, event objects and the like, and the events are key bases for decision making based on state machine technology.
To reduce the operators' workload, so that one operator can oversee the work of several unmanned systems at the same time and act only as the system's decision maker rather than its controller, the event generation module is added; in normal circumstances the operator only needs to pay attention to events, not to all data reported by the unmanned systems.
The unmanned system state machine module is used for generating task instructions of all unmanned systems according to the events, sending all the task instructions to the corresponding unmanned systems, and responding the task instructions and completing the tasks after the unmanned systems receive the task instructions; the task instructions comprise global decision task instructions and single-machine decision task instructions.
Specifically, the unmanned-system state-machine modules correspond one-to-one with the unmanned systems and are responsible for making decisions and generating unmanned-system task instructions according to events; each module is thus the decision centre both for a single unmanned platform executing tasks independently and for several unmanned platforms executing tasks cooperatively.
The state machine is divided into a collaborative-decision part and a single-machine-decision part. In the collaborative-decision part all unmanned systems are identical, while in the single-machine-decision part each unmanned system differs according to its type and capabilities. Because the inputs to all unmanned systems are identical, identical decisions are guaranteed to be produced in collaborative decision-making.
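The determinism claim here (identical inputs produce identical collaborative decisions) can be illustrated with a minimal state machine whose transitions depend only on the (state, event) pair. The states and events below are illustrative assumptions, not the application's actual state set.

```python
# Transition table: (current state, event) -> next state.
TRANSITIONS = {
    ("patrol", "target_found"): "track",
    ("track", "target_confirmed"): "collect_evidence",
    ("track", "target_lost"): "patrol",
    ("collect_evidence", "evidence_done"): "patrol",
}

class UnmannedStateMachine:
    def __init__(self, state="patrol"):
        self.state = state

    def handle(self, event):
        # Unknown (state, event) pairs leave the state unchanged.
        self.state = TRANSITIONS.get((self.state, event), self.state)
        return self.state

# Two platforms fed the same event stream pass through the same states.
uav_sm, usv_sm = UnmannedStateMachine(), UnmannedStateMachine()
stream = ["target_found", "target_confirmed", "evidence_done"]
states = [(uav_sm.handle(e), usv_sm.handle(e)) for e in stream]
```

Because `TRANSITIONS` is a pure lookup with no randomness or hidden state, any two instances given the same event stream necessarily agree, which is the property the collaborative-decision part relies on.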
A current unmanned system generally has only limited program-control capability and can autonomously execute only simple tasks; adding a task state-machine module for each unmanned system improves the program-control capability, and thus the degree of autonomy, without changing the unmanned system's software structure.
For specific limitations of the apparatus for cooperatively completing tasks with different unmanned systems, reference may be made to the limitations of the corresponding method above, which are not repeated here.
Example three
Referring to fig. 3, to implement the method for cooperatively completing tasks with different unmanned systems, this embodiment provides a system for cooperatively completing tasks with different unmanned systems. It requires a small adjustment to each unmanned system's hardware: a set of communication devices capable of communicating with the cooperative control device must be added to each unmanned system.
The system specifically comprises unmanned systems, a first communication device, second communication devices and the device for cooperatively completing tasks with different unmanned systems. The first communication device is electrically connected with the device for cooperatively completing tasks with different unmanned systems; a second communication device is installed on each unmanned system; and each unmanned system is communicatively connected with the device for cooperatively completing tasks with different unmanned systems through the first communication device and its second communication device.
In this embodiment, two-way communication between each unmanned-system platform and the cooperative control device is achieved by adding communication devices; interconnection and intercommunication of the unmanned systems are achieved with only minor modifications to the existing unmanned systems and minor adjustments at the software level.
Example four
Referring to fig. 4, to implement the method for cooperatively completing tasks with different unmanned systems, this embodiment provides a system for cooperatively completing tasks with different unmanned systems in which the cooperative control device establishes a two-way communication link with the unmanned systems through a network formed by network switches, so no additional communication module needs to be installed.
The system specifically comprises an unmanned system, a network switch and a device for cooperatively completing tasks of different unmanned systems, wherein the device for cooperatively completing tasks of different unmanned systems is in communication connection with a ground control station of the unmanned systems through the network switch.
In this embodiment, two-way communication among the unmanned systems, the cooperative control device and the unmanned-system control stations is achieved by opening up the unmanned systems' ground stations; the hardware structure is unchanged, and interconnection and intercommunication of the unmanned systems are achieved with only minor adjustments at the software level.
Example five
The present embodiment provides a computer device comprising a memory storing a computer program and a processor that implements the steps of the method for cooperatively completing tasks with different unmanned systems when it executes the computer program.
Example six
The present embodiment provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of a method for co-operating with different unmanned systems to accomplish tasks.
The technical features of the above embodiments may be combined in any way (as long as the combined features are not contradictory). For brevity, not all possible combinations of the technical features of the above embodiments are described; these unwritten combinations should nevertheless be considered within the scope of the present description.
The foregoing describes the present application in general terms and through specific embodiments. It should be appreciated that numerous conventional modifications and further innovations may be made to these specific embodiments based on the technical concepts of the present application; provided they do not depart from the technical spirit of the present application, such conventional modifications and further innovations are also intended to fall within the scope of the claims of the present application.

Claims (10)

1. A method for co-completing tasks in conjunction with different unmanned systems, comprising:
receiving platform information and sensor information sent by a plurality of different unmanned systems, and generating a global situation according to the platform information and the sensor information;
generating an event according to the global situation and a preset event generation rule;
generating task instructions of each unmanned system according to the events, and sending each task instruction to the corresponding unmanned system; the unmanned system responds to the task instruction and completes the task after receiving the task instruction; the task instructions comprise global decision task instructions and single-machine decision task instructions.
2. The method for cooperatively completing tasks by different unmanned systems according to claim 1, wherein generating the global situation according to the platform information and the sensor information specifically comprises:
filtering the platform information and the sensor information using a Kalman filtering algorithm; and
time-aligning the filtered platform information and sensor information, and fusing them using a k-means algorithm and a Bayesian network to obtain the global situation.
3. The method for cooperatively completing tasks by different unmanned systems according to claim 1, wherein the global decision task instruction is generated according to a system task goal and the global situation, and the single-machine decision task instruction is generated according to the global decision task instruction and the unmanned system's own state.
4. The method for cooperatively completing tasks by different unmanned systems according to claim 1, wherein each event comprises an event type, a generation time, event coordinates, and an event object.
5. An apparatus for cooperatively completing tasks by different unmanned systems, comprising:
a situation information processing module, configured to receive platform information and sensor information sent by a plurality of different unmanned systems, and to generate a global situation according to the platform information and the sensor information;
an event generation module, configured to generate an event according to the global situation and a preset event generation rule; and
an unmanned system state machine module, configured to generate a task instruction for each unmanned system according to the event and to send each task instruction to the corresponding unmanned system, wherein each unmanned system, after receiving its task instruction, responds to the task instruction and completes the task; the task instructions comprise global decision task instructions and single-machine decision task instructions.
6. The apparatus for cooperatively completing tasks by different unmanned systems according to claim 5, wherein a plurality of unmanned system state machine modules are provided, each unmanned system state machine module corresponding to one unmanned system.
7. A system for cooperatively completing tasks by different unmanned systems, comprising the apparatus for cooperatively completing tasks by different unmanned systems according to any one of claims 5 to 6, a first communication device, and a plurality of unmanned systems, wherein the first communication device is electrically connected to the apparatus, and each unmanned system is provided with a separate second communication device for establishing a communication connection with the apparatus through the first communication device.
8. A system for cooperatively completing tasks by different unmanned systems, comprising the apparatus for cooperatively completing tasks by different unmanned systems according to any one of claims 5 to 6, a network switch, and a plurality of unmanned systems, wherein the apparatus is communicatively connected to a ground control station of each unmanned system through the network switch.
9. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, implements the steps of the method of any one of claims 1 to 4.
10. A computer readable storage medium on which a computer program is stored, characterized in that the computer program, when executed by a processor, implements the steps of the method of any one of claims 1 to 4.
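The pipeline of claims 1–4 (filter reports, fuse them into a global situation, fire events against preset rules, issue per-vehicle task instructions) can be illustrated with a minimal sketch. All function names and data shapes are illustrative assumptions, the Kalman filter is a simplified scalar model, and the rounding-based clustering is only a stand-in for the claimed k-means + Bayesian-network fusion; none of this is the patented implementation.

```python
# Step 1 (claim 2): smooth noisy scalar position reports with a 1-D Kalman filter.
def kalman_filter(measurements, q=1e-3, r=0.25):
    x, p = measurements[0], 1.0           # state estimate and its variance
    out = [x]
    for z in measurements[1:]:
        p += q                            # predict: variance grows by process noise q
        k = p / (p + r)                   # Kalman gain (r = measurement noise)
        x += k * (z - x)                  # update with the measurement residual
        p *= (1 - k)
        out.append(x)
    return out

# Step 2 (claims 1-2): fuse time-aligned reports into a "global situation".
# Illustrative stand-in for k-means + Bayesian-network fusion: group vehicles
# whose reported positions round to the same cell.
def fuse(reports):
    situation = {}
    for vehicle, pos in reports:
        situation.setdefault(round(pos), []).append(vehicle)
    return situation

# Step 3 (claims 1, 4): apply a preset event generation rule to the situation.
# Each event carries a type, coordinates, and the objects involved.
def generate_events(situation, rule=lambda vs: len(vs) >= 2):
    return [{"type": "converge", "coord": c, "objects": vs}
            for c, vs in situation.items() if rule(vs)]

# Step 4 (claims 1, 3, 5): a toy state machine turns events into task
# instructions; the default "patrol" stands in for single-machine decisions,
# event-driven orders for global decisions.
def task_instructions(events, vehicles):
    orders = {v: "patrol" for v in vehicles}
    for ev in events:
        for v in ev["objects"]:
            orders[v] = f"handle {ev['type']} at {ev['coord']}"
    return orders
```

For example, reports `[("uav1", 3.1), ("ugv1", 2.9), ("usv1", 7.0)]` fuse into two cells; the two-vehicle cell fires a `converge` event, so `uav1` and `ugv1` receive a global-decision order while `usv1` keeps its single-machine default.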
CN202310617786.0A 2023-05-30 2023-05-30 Method, device and system for cooperatively completing tasks by different unmanned systems Active CN116339220B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310617786.0A CN116339220B (en) 2023-05-30 2023-05-30 Method, device and system for cooperatively completing tasks by different unmanned systems


Publications (2)

Publication Number Publication Date
CN116339220A true CN116339220A (en) 2023-06-27
CN116339220B CN116339220B (en) 2023-08-01

Family

ID=86893348

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310617786.0A Active CN116339220B (en) 2023-05-30 2023-05-30 Method, device and system for cooperatively completing tasks by different unmanned systems

Country Status (1)

Country Link
CN (1) CN116339220B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110308740A * 2019-06-28 2019-10-08 天津大学 Unmanned aerial vehicle group dynamic task allocation method for mobile target tracking
CN112561227A (en) * 2020-10-26 2021-03-26 南京集新萃信息科技有限公司 Multi-robot cooperation method and system based on recurrent neural network
CN112987778A (en) * 2021-02-03 2021-06-18 中国人民解放军军事科学院国防科技创新研究院 Heterogeneous group unmanned system collaborative task management system based on group roles
CN113316118A (en) * 2021-05-31 2021-08-27 中国人民解放军国防科技大学 Unmanned aerial vehicle cluster network self-organizing system and method based on task cognition
CN114510068A (en) * 2022-02-24 2022-05-17 北京航空航天大学 Multi-unmanned aerial vehicle collaborative situation perception method and system based on information fusion
CN114565282A (en) * 2022-03-01 2022-05-31 济南浪潮智投智能科技有限公司 Intelligent city management system based on unmanned patrol and implementation method
US11348269B1 (en) * 2017-07-27 2022-05-31 AI Incorporated Method and apparatus for combining data to construct a floor plan


Also Published As

Publication number Publication date
CN116339220B (en) 2023-08-01

Similar Documents

Publication Publication Date Title
CN112255946B (en) Ship state remote monitoring system and method based on cloud service and big data
KR101963826B1 (en) System for flying safety and fault recovery in swarm flight of unmanned aerial vehicles and method thereof
CN109961157B (en) Inspection method and system of solar photovoltaic power generation system
CN208506594U (en) Unmanned platform cluster cooperative control system
CN103971542B Ship real-time monitoring system
Wolf et al. 360‐degree visual detection and target tracking on an autonomous surface vehicle
CN105303899A Mother-child cooperative robot system combining an unmanned surface vessel and an unmanned aerial vehicle
CN113778132B (en) Integrated parallel control platform for sea-air collaborative heterogeneous unmanned system
CN111213367B (en) Load control method and device
CN212367306U (en) Ship remote data management system and ship
Bürkle et al. Maritime surveillance with integrated systems
US20210403157A1 (en) Command and Control Systems and Methods for Distributed Assets
CN111007852A (en) System architecture of ship and intelligent ship
CN112003905A (en) Ship-shore multi-terminal data sharing method and system and intelligent ship application management system
CN115272888A (en) Digital twin-based 5G + unmanned aerial vehicle power transmission line inspection method and system
CN111260900B (en) Buoy-based multi-system heterogeneous data processing system
CN116339220B (en) Method, device and system for cooperatively completing tasks by different unmanned systems
CN114189517B (en) Heterogeneous autonomous unmanned cluster unified access management and control system
CN115334098A (en) Enterprise digital system based on industrial PaaS technology
KR100322665B1 (en) a navigation system for ship
CN205829698U Data transmission device
CN114793239A (en) System and method for realizing inland river intelligent ship domain controller function
CN114707304A (en) Virtual-real combined multi-unmanned aerial vehicle perception avoidance verification system and method
RU2667040C1 (en) Integrated computer system of aircraft ms-21
CN106713292A (en) Ship real-time monitoring system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant