
Command control device, method, system, computer equipment and medium

Info

Publication number
CN112766595A
CN112766595A
Authority
CN
China
Prior art keywords: scene, data, action, module, simulation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110123302.8A
Other languages
Chinese (zh)
Other versions
CN112766595B (en)
Inventor
常予
蔡明春
苏磊
张蕴弛
李梦芯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Electronic System Engineering
Original Assignee
Beijing Institute of Electronic System Engineering
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Electronic System Engineering filed Critical Beijing Institute of Electronic System Engineering
Priority to CN202110123302.8A priority Critical patent/CN112766595B/en
Publication of CN112766595A publication Critical patent/CN112766595A/en
Application granted granted Critical
Publication of CN112766595B publication Critical patent/CN112766595B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/04 Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/27 Replication, distribution or synchronisation of data between databases or within a distributed database system; Distributed database system architectures therefor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/51 Indexing; Data structures therefor; Storage structures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00 Computer-aided design [CAD]
    • G06F30/20 Design optimisation, verification or simulation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 Machine learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/06 Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063 Operations research, analysis or management
    • G06Q10/0631 Resource planning, allocation, distributing or scheduling for enterprises or organisations
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/06 Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063 Operations research, analysis or management
    • G06Q10/0631 Resource planning, allocation, distributing or scheduling for enterprises or organisations
    • G06Q10/06311 Scheduling, planning or task assignment for a person or group
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10 Services
    • G06Q50/26 Government or public services
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Human Resources & Organizations (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Strategic Management (AREA)
  • Economics (AREA)
  • Tourism & Hospitality (AREA)
  • General Engineering & Computer Science (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Marketing (AREA)
  • General Business, Economics & Management (AREA)
  • Development Economics (AREA)
  • Software Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Quality & Reliability (AREA)
  • Operations Research (AREA)
  • Game Theory and Decision Science (AREA)
  • Educational Administration (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Mathematical Physics (AREA)
  • Medical Informatics (AREA)
  • Computer Hardware Design (AREA)
  • Primary Health Care (AREA)
  • Geometry (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Molecular Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention discloses a command control device, method, system, computer equipment and medium. A specific implementation of the command control device includes: a perception information input module for acquiring multiple classes of scene perception data; a distributed cloud processing module for processing and identifying the scene perception data to generate at least one scene information database corresponding to the scene perception data; a human-computer interaction module for acquiring an action target output by a user based on the scene perception data; a parallel simulation module for performing simulation according to the scene information database and the action target to generate simulation deduction data; a situation prediction module for inputting the scene information database and the action target into a trained prediction model to generate situation prediction information; and a task output module for generating action tasks according to the deduction data and the prediction information.

Description

Command control device, method, system, computer equipment and medium
Technical Field
The present invention relates to the field of command and control technologies, and in particular, to a command control apparatus, method, system, computer device, and medium for an unmanned mobile unit cluster.
Background
In the era of intelligent warfare, scene perception data are massive and heterogeneous, unmanned platforms are in wide use, and the action space is expanding at an accelerating pace from the physical and information domains into the cognitive domain. This places higher demands on the accuracy, timeliness and effectiveness of command and control, which is trending toward the command and control of clusters of intelligent unmanned action units.
Accordingly, there is a need for new command and control apparatus, methods, systems, computer devices, and media.
Disclosure of Invention
In order to solve at least one of the above problems, a first embodiment of the present invention provides a command control apparatus, including:
the perception information input module is used for acquiring multi-class scene perception data;
a distributed cloud processing module for processing and identifying the scene perception data to generate at least one scene information database corresponding to the scene perception data;
the human-computer interaction module is used for acquiring an action target output by a user based on the scene perception data;
the parallel simulation module is used for carrying out simulation according to the scene information database and the action target to generate simulation deduction data;
the situation prediction module is used for inputting the scene information database and the action target into a trained prediction model to generate situation prediction information;
and the task output module is used for generating action tasks according to the deduction data and the prediction information.
Further, the parallel simulation module is further configured to output simulation deduction data to the situation prediction module;
the situation prediction module is further used for expanding the scene information database according to the simulation deduction data and inputting the expanded scene information database and the action target into a trained prediction model to generate situation prediction information.
Further, the distributed cloud processing module comprises:
the filtering processing unit is used for filtering scene perception data, and the scene perception data comprises list perception data and image perception data;
the sliding window processing unit is used for cutting, classifying and connecting the filtered list perception data;
and the image processing unit is used for identifying, matching and labeling the filtered image perception data.
Further, the prediction model is generated by training through machine learning, and the machine learning includes one or more of reinforcement learning, deep learning and deep reinforcement learning.
Further, the parallel simulation module is further configured to generate multiple simulation models by using a twin service technology and a simulation service technology, and to generate simulation deduction data according to the scene information database and the action target.
Further, the scene perception data includes: state data and environmental data acquired by a plurality of sensors;
the scene information database includes: the system comprises a blue-side action situation database, a red-side action situation database and a scene environment database, wherein the scene environment database compensates the blue-side action situation database and the red-side action situation database;
the action task comprises: action planning, force distribution and grouping, and unit node allocation.
A second embodiment of the present invention provides a method for controlling by using the command control device, including:
acquiring multi-class scene perception data;
processing and identifying the scene perception data to generate at least one scene information database corresponding to the scene perception data;
acquiring an action target output by a user based on the scene perception data;
simulating according to the scene information database and the action target to generate simulation deduction data;
inputting the scene information database and the action target into a trained prediction model to generate situation prediction information;
and generating an action task according to the deduction data and the prediction information.
A third embodiment of the present invention provides a command control system, including:
at least one command and control device as described above;
and at least two action units in communication with each other, wherein the action units acquire and execute the action tasks of the command control device through a communication network.
A fourth embodiment of the invention provides a computer-readable storage medium, on which a computer program is stored, which program, when being executed by a processor, is adapted to carry out the method as described above.
A fifth embodiment of the invention provides a computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the method as described above when executing the program.
The invention has the following beneficial effects:
To address the existing problems, the invention provides a command control device, method, system, computer equipment and medium. Scene perception data are stored in a distributed manner by the distributed cloud processing module; a large-scale database is formed and deduced upon by the parallel simulation module; and situation prediction information is generated by the situation prediction module in a training simulation environment. Action tasks closely tied to the scene perception data are thereby generated, achieving real-time collection of scene perception data and real-time perception and prediction of the global scene situation, with broad application prospects.
Drawings
In order to illustrate the technical solutions in the embodiments of the present invention more clearly, the drawings needed for describing the embodiments are briefly introduced below. The drawings described below are only some embodiments of the present invention; those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a structural framework diagram of a command control device according to an embodiment of the present invention;
FIG. 2 shows a flow chart of a method of directing according to an embodiment of the invention;
fig. 3 is a structural framework diagram of a command control system according to a specific example of the present invention;
FIG. 4 illustrates a structural framework diagram of a communication system between a command control device and an action subunit of one embodiment of the present invention;
fig. 5 is a schematic structural diagram of a computer device according to another embodiment of the present invention.
Detailed Description
In order to more clearly illustrate the invention, the invention is further described below with reference to preferred embodiments and the accompanying drawings. Similar parts in the figures are denoted by the same reference numerals. It is to be understood by persons skilled in the art that the following detailed description is illustrative and not restrictive, and is not to be taken as limiting the scope of the invention.
It is to be noted that relational terms such as first and second are used herein solely to distinguish one entity or action from another and do not necessarily require or imply any actual such relationship or order between such entities or actions. The terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a(n)..." does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises the element.
As shown in fig. 1, a command control device according to a first embodiment of the present invention includes:
the perception information input module is used for acquiring multi-class scene perception data;
a distributed cloud processing module for processing and identifying the scene perception data to generate at least one scene information database corresponding to the scene perception data;
the human-computer interaction module is used for acquiring an action target output by a user based on the scene perception data;
the parallel simulation module is used for carrying out simulation according to the scene information database and the action target to generate simulation deduction data;
the situation prediction module is used for inputting the scene information database and the action target into a trained prediction model to generate situation prediction information;
and the task output module is used for generating action tasks according to the deduction data and the prediction information.
In this embodiment, scene perception data are stored in a distributed manner by the distributed cloud processing module; a large database is formed and deduced upon by the parallel simulation module; and situation prediction information is generated by the situation prediction module in the training simulation environment. Action tasks closely tied to the scene perception data are thereby generated, achieving real-time collection of scene perception data and real-time perception and prediction of the global scene situation, with broad application prospects.
In an alternative embodiment, the scene perception data comprises: status data and environmental data acquired using a number of sensors.
In a specific example, in a scene oriented to a cluster of unmanned action units, sensing and detection devices such as radars, detectors and unmanned aerial vehicles perceive the equipment and action tracks of both the red and blue sides, yielding state data such as azimuth, speed, coordinates and distance for each side. Environment detection devices such as temperature, pressure and speed sensors perceive environmental data such as weather, wind and temperature, so that scene front-end data are collected in real time and the scene situation is analyzed and monitored in real time. The environmental data and state data are transmitted to the distributed cloud processing module via satellites, base stations, routing nodes or other communication means. In a specific example, as shown in fig. 3, the perception data further include historical data of previous actions, basic data, superior command information, and the state data and environmental data acquired by the sensors.
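For illustration only, the following minimal sketch (not part of the patent disclosure) shows one way such multi-class perception records could be represented in Python; every field name here is an assumption.

```python
# Illustrative record types for the state and environment data described
# above. All field names and units are assumptions, not patent values.
from dataclasses import dataclass

@dataclass
class StateRecord:
    source: str          # e.g. "radar", "uav"
    side: str            # "red" or "blue"
    azimuth_deg: float   # observed azimuth
    speed_mps: float     # observed speed
    position: tuple      # (x, y, z) coordinates
    distance_m: float    # range to the target
    timestamp: float

@dataclass
class EnvironmentRecord:
    weather: str         # e.g. "clear", "rain"
    wind_mps: float
    temperature_c: float
    timestamp: float
```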
In an alternative embodiment, the distributed cloud processing module comprises:
the filtering processing unit is used for filtering scene perception data, and the scene perception data comprises list perception data and image perception data;
the sliding window processing unit is used for cutting, classifying and connecting the filtered list perception data;
and the image processing unit is used for identifying, matching and labeling the filtered image perception data.
In this embodiment, to obtain accurate scene perception data, the collected scene perception data output as waveform signals are first filtered to improve their signal-to-noise ratio. The list-type perception data are then processed with a sliding window to cut, classify and connect the data, and the image-type perception data are identified, matched and labeled. The result is a scene information database covering multiple classes of scene information, with accurate data and wide coverage.
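The filtering and sliding-window steps can be sketched as below, assuming the list-type perception data arrive as a numeric time series; the moving-average filter, window width and stride are illustrative choices, not values from the patent.

```python
# Sketch of the filtering unit and sliding window unit described above.
import numpy as np

def moving_average_filter(signal, k=5):
    """Simple low-pass filter to raise the signal-to-noise ratio."""
    kernel = np.ones(k) / k
    return np.convolve(signal, kernel, mode="same")

def sliding_windows(samples, width=32, stride=16):
    """Cut a filtered series into overlapping segments that can then be
    classified and reconnected downstream."""
    return [samples[i:i + width]
            for i in range(0, len(samples) - width + 1, stride)]

raw = np.random.randn(256)   # stand-in for list-type sensor output
segments = sliding_windows(moving_average_filter(raw))
```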
In an optional embodiment, the scene information database comprises: a blue-side action situation database, a red-side action situation database and a scene environment database, wherein the scene environment database compensates the blue-side action situation database and the red-side action situation database.
In a specific example, after the acquired scene perception data are processed and classified by the distributed cloud processing module, a clustered blue-side action situation database, red-side action situation database and scene environment database are formed. Further, in this embodiment, the red-side and blue-side action situation databases are compensated and fused using the scene environment database, fully accounting for the influence of environmental data on both sides, so that the scene information database is more accurate and real-time acquisition of scene information and accurate generation of the global situation are achieved.
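A hedged sketch of this compensation idea follows: environment data adjusting a situation estimate. The correction rule (subtracting a wind component from observed speed) is a toy assumption; the patent does not specify a model.

```python
# Toy illustration of the environment database compensating a situation
# database. The correction rule is assumed for illustration only.
def compensate(track, env):
    """Adjust an observed track using scene environment data."""
    corrected = dict(track)
    # e.g. remove the wind contribution from the observed speed
    corrected["speed_mps"] = track["speed_mps"] - env["wind_mps"]
    return corrected

blue_track = {"id": "B-01", "speed_mps": 240.0}
env = {"wind_mps": 12.0, "temperature_c": -5.0}
blue_track_fused = compensate(blue_track, env)
```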
In a specific example, as shown in fig. 3, the distributed cloud processing module is a distributed cloud platform that can quickly store and process the various types of data from the scene front end, enabling remote deployment of the command device and rapid transmission of information.
When the perception information input module transmits the scene perception data to the distributed cloud processing module, it also transmits the data to the human-computer interaction module; a commander obtains the scene perception data output by the human-computer interaction module and issues an action target according to the current scene situation. In a specific example, as shown in fig. 3, the human-computer interaction module is a human-computer interaction platform.
The parallel simulation module performs simulation deduction according to the scene information database output by the distributed cloud processing module and the action target forwarded by the human-computer interaction module, generating simulation deduction data that take the action target as the outcome and the scene information data as conditions. The simulation deduction data form a large set of simulated data, from which the feasibility of achieving the action target under the current situation can be determined.
Meanwhile, the situation prediction module inputs the scene information database and the action target into a trained prediction model to generate situation prediction information. In an alternative embodiment, the prediction model is generated by training using machine learning, and the machine learning includes one or more of reinforcement learning, deep learning, and deep reinforcement learning.
In a specific example, during the training of the prediction model, a historical scene information database provides the training data for large-scale machine-learning training, testing and evaluation. Artificial intelligence algorithms (deep learning, reinforcement learning, deep reinforcement learning, etc.) perform analyses such as maneuver type recognition, flight trajectory prediction, threat level classification and agent action planning, yielding high-precision situation prediction information.
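As a hedged example of such training, the sketch below fits a toy threat-level classifier with PyTorch; the framework choice, architecture, feature width and label set are all assumptions made for illustration, not the patent's model.

```python
# Minimal sketch: training a situation prediction model by supervised
# deep learning, here a toy 3-class threat-level classifier.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(16, 64), nn.ReLU(),
    nn.Linear(64, 3),                 # 3 assumed threat levels
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

features = torch.randn(512, 16)       # stand-in for scene features
labels = torch.randint(0, 3, (512,))  # stand-in threat labels
for _ in range(100):
    opt.zero_grad()
    loss = loss_fn(model(features), labels)
    loss.backward()
    opt.step()
```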
In an alternative embodiment, as shown in fig. 1, the parallel simulation module is further configured to output simulation deduction data to the situation prediction module;
the situation prediction module is further used for expanding the scene information database according to the simulation deduction data and inputting the expanded scene information database and the action target into a trained prediction model to generate situation prediction information.
In this embodiment, the parallel simulation module simulates the scene perception data and the action target intelligently and in parallel in multiple digital twin spaces, obtaining a large amount of simulated scene perception data. These simulated data are then used to expand the scene information database, so that the situation prediction information the prediction model produces from that database is more accurate.
In this embodiment, the scene information database of the situation prediction model is associated with the simulation deduction data of the parallel simulation module and is further expanded by the large amount of deduction data the module produces, improving the robustness of machine learning and the accuracy of the situation prediction information. In turn, the situation prediction information can correct the scene situation development deduced by the parallel simulation module, ensuring the reliability of the parallel deduction.
In an optional embodiment, the parallel simulation module is further configured to generate multiple simulation models using a twin service technology and a simulation service technology, and to generate simulation deduction data according to the scene information database and the action target.
The parallel simulation module of this embodiment generates multiple twin spaces using a twin service technology and performs repeated simulation deduction in them according to the scene information database and the action target, obtaining a large amount of scene-simulation deduction data through parallel simulation. On the one hand, the deduction data yield multiple action schemes for the action target, raising the probability of success; on the other hand, they expand the training database of the artificial intelligence model, improving the accuracy of scene situation prediction.
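The parallel-deduction pattern can be sketched as below, with the simulation itself reduced to a stub; only the many-twin, run-in-parallel structure is illustrated, and the seed-perturbation scheme is an assumption.

```python
# Sketch of parallel deduction across multiple "twin" instances.
from concurrent.futures import ProcessPoolExecutor
import random

def run_twin(seed):
    """One simulation deduction under a perturbed initial situation.
    Stub: reports whether the action target was achieved and a score."""
    rng = random.Random(seed)
    return {"seed": seed, "success": rng.random() > 0.4,
            "score": rng.uniform(0, 1)}

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        deduction_data = list(pool.map(run_twin, range(100)))
    # aggregate feasibility of the action target under the current situation
    p_success = sum(d["success"] for d in deduction_data) / len(deduction_data)
```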
The simulation deduction data produced by the parallel simulation module and the situation prediction information generated by the situation prediction module are output to the task output module, which generates multi-element action tasks for commanding the action units according to the friend-foe situation prediction information and the parallel deduction results of the various action schemes. In an alternative embodiment, the action task includes: action planning, force distribution and grouping, and unit node allocation.
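One plausible combination rule is sketched below: candidate action schemes are scored from the deduction results and the prediction information, and the best scheme is emitted as the action task with its grouping and unit allocation. The 0.6/0.4 weighting and all data shapes are illustrative assumptions, not the patent's rule.

```python
# Assumed scoring-and-selection step for the task output module.
def generate_action_task(schemes, deduction, prediction):
    def score(s):
        # blend deduced success rate with predicted situation favourability
        return 0.6 * deduction[s]["success_rate"] + 0.4 * prediction[s]
    best = max(schemes, key=score)
    return {"plan": best,
            "force_grouping": deduction[best]["grouping"],
            "unit_allocation": deduction[best]["units"]}

schemes = ["scheme_a", "scheme_b"]
deduction = {"scheme_a": {"success_rate": 0.7, "grouping": "G1", "units": [1, 2]},
             "scheme_b": {"success_rate": 0.5, "grouping": "G2", "units": [3]}}
prediction = {"scheme_a": 0.6, "scheme_b": 0.8}
task = generate_action_task(schemes, deduction, prediction)
```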
Scene perception data are stored in a distributed manner by the distributed cloud processing module, a large-scale database is formed and deduced upon by the parallel simulation module, and situation prediction information is generated by the situation prediction module in the training simulation environment; action tasks closely tied to the scene perception data are thereby generated, achieving real-time collection of scene perception data and real-time perception and prediction of the global scene situation, with broad application prospects.
Therefore, the distributed command device of this embodiment improves the reliability and scalability of the command device in complex scenes and the responsiveness of the command system; combining the parallel simulation module with situation prediction enhances the device's ability to perceive and predict the scene situation and improves its autonomous controllability and its level of intelligent action in novel scenes.
Corresponding to the aforesaid command control device, as shown in fig. 2, another embodiment of the present invention further provides a method of control using the aforesaid command control device, including:
acquiring multi-class scene perception data;
processing and identifying the scene perception data to generate at least one scene information database corresponding to the scene perception data;
acquiring an action target output by a user based on the scene perception data;
simulating according to the scene information database and the action target to generate simulation deduction data;
inputting the scene information database and the action target into a trained prediction model to generate situation prediction information;
and generating an action task according to the deduction data and the prediction information.
According to this method of command control using the command control device, scene perception data are stored in a distributed manner by the distributed cloud processing module, a large-scale database is formed and deduced upon by the parallel simulation module, and situation prediction information is generated by the situation prediction module in the training simulation environment, so that action tasks closely tied to the scene perception data are generated, achieving real-time collection of scene perception data and real-time perception and prediction of the global scene situation, with broad application prospects.
Since the command control method provided in the embodiments of the present application corresponds to the command control devices of the foregoing embodiments, the foregoing description also applies to the method and is not repeated here.
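Read end to end, the claimed method is a pipeline; the sketch below strings the six steps together with trivial stubs. Every function here is a hypothetical stand-in for the corresponding module, not the patent's implementation.

```python
# End-to-end sketch of the claimed method; all bodies are stubs.
def acquire_perception_data(sensors):
    return [read() for read in sensors]

def build_scene_databases(data):
    return {"red": data, "blue": data, "env": data}

def parallel_simulate(db, target):
    return {"success_rate": 0.7}

def predict_situation(db, target):
    return {"threat": "medium"}

def generate_task(deduction, prediction):
    return {"plan": "advance" if deduction["success_rate"] > 0.5 else "hold",
            "context": prediction}

def command_control_step(sensors, get_target):
    data = acquire_perception_data(sensors)     # step 1: perception input
    db = build_scene_databases(data)            # step 2: scene databases
    target = get_target(data)                   # step 3: user action target
    deduction = parallel_simulate(db, target)   # step 4: parallel deduction
    prediction = predict_situation(db, target)  # step 5: situation prediction
    return generate_task(deduction, prediction) # step 6: action task

task = command_control_step([lambda: 1.0], lambda data: "seize_area")
```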
Corresponding to the aforesaid command control device, as shown in fig. 3, another embodiment of the present invention further provides a command control system, including:
at least one command and control device as described above;
and at least two action units in communication with each other, wherein the action units acquire and execute the action tasks of the command control device through a communication network.
The command control devices provided by the invention can be deployed at the scene front. Each command control device is an intelligent node; the nodes are equal in status, independent of one another, and capable of forming groups and networks, improving the survivability of the command control system in the scene. Meanwhile, this modular command control mode breaks with the traditional centralized mode of echelon-by-echelon transmission and achieves efficient and rapid decision-making in the command control process.
In this embodiment, each action unit maintains communication within the coverage area of the mobile station and multiple types of action units can be rapidly allocated according to the action task, using an observe-orient-decide-act (OODA) cycle to form a complete dynamic action formation for executing the task. In a specific example, the action units include: unmanned action units, manned action units and hybrid action units. The action units of the embodiment of the invention can allocate force resources in a self-maintaining, self-managing manner, achieve autonomous cooperation and dynamic formation, and form a large-scale dynamic kill network, greatly improving the efficiency of precise strikes and dynamic decision-making.
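A toy sketch of such an OODA-style allocation cycle follows; the rule of dispatching the nearest idle unit is an illustrative assumption, not the patent's allocation method.

```python
# Toy observe-orient-decide-act (OODA) cycle for dynamic formation.
def ooda_cycle(units, tasks):
    assignments = {}
    for task in tasks:                 # observe + orient: current task list
        idle = [u for u in units if u["idle"]]
        if not idle:
            break
        chosen = min(idle, key=lambda u: abs(u["pos"] - task["pos"]))
        chosen["idle"] = False         # decide: nearest idle unit takes it
        assignments[task["id"]] = chosen["id"]  # act: dispatch the unit
    return assignments

units = [{"id": "U1", "pos": 0.0, "idle": True},
         {"id": "U2", "pos": 5.0, "idle": True}]
tasks = [{"id": "T1", "pos": 4.0}, {"id": "T2", "pos": 1.0}]
print(ooda_cycle(units, tasks))   # e.g. {'T1': 'U2', 'T2': 'U1'}
```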
In a specific example, an implementation of the communication network proposed in this embodiment is shown in fig. 4 and includes:
satellite networks, communication devices, and wireless base stations.
The communication equipment includes: a data gateway, a central router, an Ethernet switch, a distributed server and an authorization server. The communication equipment is connected to multiple wireless base stations through the satellite network and simultaneously to the wired switch, enabling communication among the commander terminal, the task output module, the distributed cloud processing module and each action unit.
The communication system eliminates bottlenecks between basic communication units, provides flexible configuration, efficient scheduling and top-level management of computing resources, better achieves interconnection within the scene command control system, and provides technical support for integrating the information and firepower of weapon systems.
Another embodiment of the present invention provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements: acquiring multiple classes of scene perception data; processing and identifying the scene perception data to generate at least one scene information database corresponding to the scene perception data; acquiring an action target output by a user based on the scene perception data; simulating according to the scene information database and the action target to generate simulation deduction data; inputting the scene information database and the action target into a trained prediction model to generate situation prediction information; and generating an action task according to the deduction data and the prediction information.
In practice, the computer-readable storage medium may take any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present embodiment, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk or C++, as well as conventional procedural programming languages such as the "C" programming language or similar languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the latter case, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
As shown in fig. 5, another embodiment of the present invention provides a schematic structural diagram of a computer device. The computer device 12 shown in FIG. 5 is only an example and should not bring any limitations to the functionality or scope of use of embodiments of the present invention.
As shown in FIG. 5, computer device 12 is in the form of a general purpose computing device. The components of computer device 12 may include, but are not limited to: one or more processors or processing units 16, a system memory 28, and a bus 18 that couples various system components including the system memory 28 and the processing unit 16.
Bus 18 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the Enhanced ISA bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
Computer device 12 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by computer device 12 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 28 may include computer system readable media in the form of volatile memory, such as random access memory (RAM) 30 and/or cache memory 32. Computer device 12 may further include other removable/non-removable, volatile/non-volatile computer system storage media. By way of example only, storage system 34 may be used to read from and write to non-removable, non-volatile magnetic media (not shown in FIG. 5, and commonly referred to as a "hard drive"). Although not shown in FIG. 5, a magnetic disk drive for reading from and writing to a removable, non-volatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, non-volatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to bus 18 by one or more data media interfaces. Memory 28 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention.
A program/utility 40 having a set (at least one) of program modules 42 may be stored, for example, in memory 28, such program modules 42 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which examples or some combination thereof may comprise an implementation of a network environment. Program modules 42 generally carry out the functions and/or methodologies of the described embodiments of the invention.
Computer device 12 may also communicate with one or more external devices 14 (e.g., keyboard, pointing device, display 24, etc.), with one or more devices that enable a user to interact with computer device 12, and/or with any devices (e.g., network card, modem, etc.) that enable computer device 12 to communicate with one or more other computing devices. Such communication may be through an input/output (I/O) interface 22. Also, computer device 12 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) via network adapter 20. As shown in FIG. 5, the network adapter 20 communicates with the other modules of the computer device 12 via the bus 18. It should be appreciated that although not shown in FIG. 5, other hardware and/or software modules may be used in conjunction with computer device 12, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
The processing unit 16 executes various functional applications and data processing by running programs stored in the system memory 28, for example to implement a command control method provided by an embodiment of the present invention.
It should be understood that the above embodiments of the present invention are only examples for clearly illustrating the invention and are not intended to limit it. Other variations or modifications based on the above description will be apparent to those skilled in the art; the embodiments cannot be listed exhaustively here, and all obvious variations or modifications derived from them fall within the scope of the present invention.

Claims (10)

1. A command control device, comprising:
the perception information input module is used for acquiring multi-class scene perception data;
a distributed cloud processing module for processing and identifying the scene perception data to generate at least one scene information database corresponding to the scene perception data;
the human-computer interaction module is used for acquiring an action target output by a user based on the scene perception data;
the parallel simulation module is used for carrying out simulation according to the scene information database and the action target to generate simulation deduction data;
the situation prediction module is used for inputting the scene information database and the action target into a trained prediction model to generate situation prediction information;
and the task output module is used for generating action tasks according to the deduction data and the prediction information.
2. The apparatus of claim 1,
the parallel simulation module is also used for outputting simulation deduction data to the situation prediction module;
the situation prediction module is further used for expanding the scene information database according to the simulation deduction data and inputting the expanded scene information database and the action target into a trained prediction model to generate situation prediction information.
3. The apparatus of claim 2, wherein the distributed cloud processing module comprises:
the filtering processing unit is used for filtering scene perception data, and the scene perception data comprises list perception data and image perception data;
the sliding window processing unit is used for cutting, classifying and connecting the filtered list perception data;
and the image processing unit is used for identifying, matching and labeling the filtered image perception data.
4. The apparatus of claim 2,
the prediction model is generated by training by utilizing machine learning, and the machine learning comprises one or more of reinforcement learning, deep learning and deep reinforcement learning.
5. The apparatus of claim 2,
the parallel simulation module is further configured to generate multiple simulation models by utilizing a twin service technology and a simulation service technology, and to generate simulation deduction data according to the scene information database and the action target.
6. The device according to any one of claims 1 to 5,
the scene perception data includes: state data and environmental data acquired by a plurality of sensors;
the scene information database includes: the system comprises a blue-side action situation database, a red-side action situation database and a scene environment database, wherein the scene environment database compensates the blue-side action situation database and the red-side action situation database;
the action task comprises: action planning, force distribution and grouping, and unit node allocation.
7. A method of control using the command control device of any one of claims 1-6, comprising:
acquiring multi-class scene perception data;
processing and identifying the scene perception data to generate at least one scene information database corresponding to the scene perception data;
acquiring an action target output by a user based on the scene perception data;
simulating according to the scene information database and the action target to generate simulation deduction data;
inputting the scene information database and the action target into a trained prediction model to generate situation prediction information;
and generating an action task according to the deduction data and the prediction information.
8. A command control system, comprising:
at least one command and control device according to any one of claims 1 to 6;
and at least two action units in communication with each other, wherein the action units acquire and execute the action tasks of the command control device through a communication network.
9. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method as claimed in claim 7.
10. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the method as claimed in claim 7 when executing the program.
CN202110123302.8A 2021-01-29 2021-01-29 Command control device, method, system, computer equipment and medium Active CN112766595B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110123302.8A CN112766595B (en) 2021-01-29 2021-01-29 Command control device, method, system, computer equipment and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110123302.8A CN112766595B (en) 2021-01-29 2021-01-29 Command control device, method, system, computer equipment and medium

Publications (2)

Publication Number Publication Date
CN112766595A true CN112766595A (en) 2021-05-07
CN112766595B CN112766595B (en) 2023-09-29

Family

ID=75706589

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110123302.8A Active CN112766595B (en) 2021-01-29 2021-01-29 Command control device, method, system, computer equipment and medium

Country Status (1)

Country Link
CN (1) CN112766595B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113438303A (en) * 2021-06-23 2021-09-24 南京孩乐康智能科技有限公司 Remote auxiliary work system and method, electronic device and storage medium
CN114201042A (en) * 2021-11-09 2022-03-18 北京电子工程总体研究所 Distributed comprehensive integrated workshop device, system, construction method and interaction method
CN114819458A (en) * 2021-12-31 2022-07-29 第四范式(北京)技术有限公司 Simulation model construction method and simulation model construction device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090192962A1 (en) * 2008-01-30 2009-07-30 Rigdon Debra A Intelligent threat assessment module, method and system for space situational awareness system
US20170146991A1 (en) * 2015-11-24 2017-05-25 Northrop Grumman Systems Corporation Spatial-temporal forecasting for predictive situational awareness
CN109598066A (en) * 2018-12-05 2019-04-09 百度在线网络技术(北京)有限公司 Effect evaluation method, device, equipment and the storage medium of prediction module
CN110288160A (en) * 2019-06-27 2019-09-27 北京华如科技股份有限公司 A kind of situation dynamic prediction method based on parallel simulation
CN111429583A (en) * 2020-03-23 2020-07-17 北京智汇云舟科技有限公司 Space-time situation perception method and system based on three-dimensional geographic information

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090192962A1 (en) * 2008-01-30 2009-07-30 Rigdon Debra A Intelligent threat assessment module, method and system for space situational awareness system
US20170146991A1 (en) * 2015-11-24 2017-05-25 Northrop Grumman Systems Corporation Spatial-temporal forecasting for predictive situational awareness
CN109598066A (en) * 2018-12-05 2019-04-09 百度在线网络技术(北京)有限公司 Effect evaluation method, device, equipment and the storage medium of prediction module
CN110288160A (en) * 2019-06-27 2019-09-27 北京华如科技股份有限公司 A kind of situation dynamic prediction method based on parallel simulation
CN111429583A (en) * 2020-03-23 2020-07-17 北京智汇云舟科技有限公司 Space-time situation perception method and system based on three-dimensional geographic information

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
周芳; 毛少杰; 吴云超; 李玉萍: "Parallel simulation deduction method driven by real-time situation data", Journal of the China Academy of Electronics and Information Technology, no. 04, pages 33-38 *
李强 et al.: "Research on data leakage situation awareness for command information systems", Proceedings of the 7th China Command and Control Conference, pages 585-589 *
蔡明春 et al.: "Intelligent warfare forms and their supporting technology systems", National Defense Technology, vol. 38, no. 1, pages 94-98 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113438303A (en) * 2021-06-23 2021-09-24 南京孩乐康智能科技有限公司 Remote auxiliary work system and method, electronic device and storage medium
CN114201042A (en) * 2021-11-09 2022-03-18 北京电子工程总体研究所 Distributed comprehensive integrated workshop device, system, construction method and interaction method
CN114201042B (en) * 2021-11-09 2023-09-15 北京电子工程总体研究所 Distributed comprehensive integrated seminar device, system, construction method and interaction method
CN114819458A (en) * 2021-12-31 2022-07-29 第四范式(北京)技术有限公司 Simulation model construction method and simulation model construction device

Also Published As

Publication number Publication date
CN112766595B (en) 2023-09-29

Similar Documents

Publication Publication Date Title
CN112766595B (en) Command control device, method, system, computer equipment and medium
CN110363058B (en) Three-dimensional object localization for obstacle avoidance using one-shot convolutional neural networks
CN109145680A (en) A kind of method, apparatus, equipment and computer storage medium obtaining obstacle information
WO2020147500A1 (en) Ultrasonic array-based obstacle detection result processing method and system
CN112650300B (en) Unmanned aerial vehicle obstacle avoidance method and device
CN113037783B (en) Abnormal behavior detection method and system
Hentati et al. Mobile target tracking mechanisms using unmanned aerial vehicle: Investigations and future directions
CN110188766A (en) Image major heading detection method and device based on convolutional neural networks
CN113723378A (en) Model training method and device, computer equipment and storage medium
CN114091515A (en) Obstacle detection method, obstacle detection device, electronic apparatus, and storage medium
Blasch Autonomy in use for information fusion systems
CN112114969A (en) Data processing method and device, electronic equipment and storage medium
CN112486676A (en) Data sharing and distributing method and device based on edge calculation
Kulbacki et al. Intelligent video monitoring system with the functionality of online recognition of people’s behavior and interactions between people
CN111610850A (en) Method for man-machine interaction based on unmanned aerial vehicle
CN113722045B (en) Cluster application deployment method and device
CN115482712A (en) Programmable group robot framework based on 5G network
Rybak et al. Increasing the information superiority on the modern battlefield through the use of virtual reality systems
CN111898564A (en) Time sequence convolution network model, model training and target identification method and device
CN111753960A (en) Model training and image processing method and device, electronic equipment and storage medium
CN111563596A (en) Uncertain information reasoning target identification method based on evidence network
CN116540568B (en) Large-scale distributed unmanned aerial vehicle cluster simulation system
CN113031600B (en) Track generation method and device, storage medium and electronic equipment
CN117991757B (en) Unmanned aerial vehicle control method and system for heterogeneous airborne radar signals
CN112748746B (en) Method, device, equipment and storage medium for acquiring flight information of aircraft

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant