CN112150635A - Digital twin-based AR individual soldier and robot mixed-formation combat system and method - Google Patents


Info

Publication number
CN112150635A
CN112150635A
Authority
CN
China
Prior art keywords
robot
ground
real
combat
virtual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010910620.4A
Other languages
Chinese (zh)
Inventor
张晶晶
赵本发
李莹
高玉水
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing J&T Simulation Technology Co., Ltd.
Original Assignee
Beijing J&T Simulation Technology Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing J&T Simulation Technology Co., Ltd.
Priority to CN202010910620.4A
Publication of CN112150635A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41WEAPONS
    • F41HARMOUR; ARMOURED TURRETS; ARMOURED OR ARMED VEHICLES; MEANS OF ATTACK OR DEFENCE, e.g. CAMOUFLAGE, IN GENERAL
    • F41H11/00Defence installations; Defence devices
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures

Abstract

The invention discloses a digital twin-based AR individual soldier and robot mixed-formation combat system and method. The system comprises: a ground combat robot; an AR display device, which can construct an AR three-dimensional virtual space, fuse virtual and real content to display the real battlefield environment, personnel, equipment and virtual targets, support interactive operation of a virtual 3D ground combat robot, and display an AR electronic sand table and the battlefield situation; a wearable device, which communicates with the ground combat robot in real time over a data link and connects to the AR display device; and an AR application system, which rapidly maps the ground combat robot and the real battlefield environment into the AR three-dimensional virtual space, realizing the "lead sheep" function of the 3D ground combat robot guiding one or more real ground combat robots. Wearable AR glasses replace handheld control equipment, freeing both of the operator's hands so that the operator can rapidly switch task roles and fight with weapons such as a rifle.

Description

Digital twin-based AR individual soldier and robot mixed-formation combat system and method
Technical Field
The invention relates to the technical field of future intelligent ground combat, and in particular to a digital twin-based AR individual soldier and robot mixed-formation combat system and method.
Background
In future light, highly mobile operations built around advanced concepts such as unmanned combat and swarm combat, the grouping and cooperation of manned and unmanned combat systems, based on unmanned-systems technology and artificial intelligence, will be an inevitable trend in the development of warfare.
According to public reports, the company Six3 designed, developed and validated a prototype "combined-arms squad" system with human-robot cooperative combat capability, combining manned and unmanned equipment, ubiquitous communications and information, and advanced combat capabilities in multiple domains to maximize squad-level combat effectiveness in an increasingly complex battle environment. Robot-and-autonomous-systems strategies embed robots and autonomous systems into combat formations; the long-term goals (2030-2040) are not limited to individual robots, and when a small robot cannot complete a task on its own, the task is pushed to completion through cooperation and teaming.
With existing ground combat robots, an operator controls combat tasks such as marching, reconnaissance, mine clearance, explosive ordnance disposal, rescue and precision shooting through the joystick and panel of a portable radio remote-control unit. The high-brightness display of the remote-control unit shows the optical image from the robot's photoelectric equipment, and reconnaissance images and data can be uploaded to a commander automatically or semi-automatically. The robot's program-control device supports autonomous travel along a planned path and automatic image recognition and tracking. The main defects and shortcomings are as follows:
1) The portable control unit has considerable volume and weight, and carrying and hand-holding it occupies both of the operator's hands, making it difficult to switch task roles rapidly, for example to fight with a weapon such as a rifle;
2) Constrained by its volume and weight, the portable control unit has a small display, which makes it inconvenient for the operator to observe images, identify and track targets, display the battlefield situation, or plan the route of the ground combat robot;
3) The display of the portable control unit can only show two-dimensional images; it cannot show three-dimensional images such as a three-dimensional electronic sand table, nor construct a virtual three-dimensional space in which to conduct operator skill training, combat-plan rehearsal and command training for the ground combat robot.
Disclosure of Invention
The technical problem to be solved by the invention is to provide a digital twin-based AR individual soldier and robot mixed-formation combat system and method that overcome the defects noted in the background: AR glasses with three-dimensional virtual-image display and gesture-recognition control replace the portable radio remote-control unit of the ground combat robot, and combat tasks such as marching, reconnaissance, mine clearance, explosive ordnance disposal, rescue and precision shooting are conducted in the virtual three-dimensional space constructed by the AR glasses.
In order to solve the technical problems, the technical scheme adopted by the invention is as follows.
A digital twin-based AR individual soldier and robot mixed-formation combat system includes:
a ground combat robot, which can march alongside the operator and can move ahead to reconnoitre, fight or perform other tasks;
an AR display device, which can construct an AR three-dimensional virtual space, fuse virtual and real content to display the real battlefield environment, personnel, equipment and virtual targets, support interactive operation of the virtual 3D ground combat robot, and display an AR electronic sand table and the battlefield situation;
a wearable device, which communicates with the ground combat robot in real time over a data link, receives the robot's operating parameters, and connects to the AR display device; the wearable device can navigate and position the operator and plot the position on the AR electronic sand table in real time; and
an AR application system, which can rapidly map the ground combat robot and the real battlefield environment into the AR three-dimensional virtual space, so that the 3D ground combat robot and the real ground combat robot share the same space-time and have a virtual-real mapping relation; it supports digital twin-style mixed-formation combat of AR individual soldiers and ground combat robots and realizes the "lead sheep" function of the 3D ground combat robot guiding one or more real ground combat robots.
In a further refinement, the ground combat robot comprises an unmanned vehicle, an autopilot, a vehicle-mounted weapon, a photoelectric turret, a servo control system and a vehicle-mounted data-link terminal; the vehicle-mounted data-link terminal is connected through standard interfaces to the robot's autopilot, vehicle-mounted weapon, photoelectric turret and servo system.
In the AR three-dimensional virtual space constructed by the AR display device, the operator can interactively operate the photoelectric turret and autopilot of the 3D ground combat robot to perform photoelectric reconnaissance, tracking and aiming, vehicle movement and weapon firing. The real ground combat robot has a mapping relation with the 3D ground combat robot and can replicate its actions to execute marching, reconnaissance, mine clearance, explosive ordnance disposal, rescue and precision-shooting tasks; this is used for operator skill training, combat-plan rehearsal and command training.
In a further refinement, the wearable device can be inserted into a left/right arm pocket of the camouflage uniform or a side pocket of a tactical backpack. The wearable device includes:
an individual navigation and positioning system, which navigates and positions the operator; its data are automatically plotted on the AR electronic sand table to display the operator's position;
an individual data-link terminal, which provides real-time communication of remote-control and telemetry information and reconnaissance images, and is connected through standard interfaces to the individual navigation and positioning system and the AR glasses; data are exchanged between the individual data-link terminal and the vehicle-mounted data-link terminal; and
a power module, which supplies electric power.
In a further refinement, the operator's position and speed parameters can drive the real ground combat robot forward for reconnaissance and attack, and the distance and bearing between the operator and the real ground combat robot are randomly regenerated within a set range on each update.
In a further refinement, the AR display device is a pair of AR glasses comprising an AR optical module, a microprocessor, an image-acquisition module, sensors, a communication module and a structural assembly;
the microprocessor constructs and controls the AR three-dimensional virtual space and the 3D ground combat robot;
the image-acquisition module performs depth-of-field image processing and gesture recognition;
the sensors include a head-tracking sensor, a hand-tracking sensor, an eye-tracking sensor, an interpupillary-distance sensor, a voice AI chip and a video AI chip;
the communication module includes a Wi-Fi module and a Bluetooth module;
the structural assembly mounts and positions the AR optical module, microprocessor, image-acquisition module, sensors and communication module, and is integrated into individual goggles or directly onto a helmet.
In a further refinement, the AR glasses construct an immersive virtual space, can perform virtual-real fused spatial mapping, can rapidly build the AR electronic sand table, can perform hand tracking, eye tracking and head tracking, and can control the 3D ground combat robot, the photoelectric turret and the vehicle-mounted weapon through natural human interaction.
In a further refinement, the AR application system comprises:
AR management software, for system login, task management, virtual screen and space management, model management and database management;
AR electronic sand table software, which, in two-dimensional or three-dimensional form, can display, plot and compute the battlefield situation on the electronic sand table and plan the route of the ground combat robot;
reconnaissance and aiming software, which displays the reconnaissance image and aiming reticle on the virtual screen of the AR glasses; by acquiring the head-tracking sensor parameters it can drive the photoelectric turret of the real ground combat robot to rotate, automatically mark and identify suspicious targets in the reconnaissance image, and drive the aiming reticle to aim at a target for firing;
3D ground combat robot control software, for operator skill training, combat-plan rehearsal and human-machine mixed-formation combat; and
AR digital twin software: on one hand, the position, attitude and speed parameters of the real ground combat robot drive the 3D ground combat robot on the AR electronic sand table to move synchronously, realizing real-time battlefield-situation display; on the other hand, the 3D ground combat robot on the three-dimensional electronic sand table is moved through gesture recognition, and the operator's head attitude is sensed to control the rotation of the 3D virtual photoelectric turret, so that the motion parameters of the 3D ground combat robot and the 3D virtual photoelectric turret drive the movement, photoelectric reconnaissance and fire employment of the real ground combat robot, realizing the virtual-real mapping relation between the 3D and real ground combat robots.
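The two directions of the digital twin mapping described above — real-robot telemetry driving the virtual 3D model for situation display, and operator input on the virtual model driving the real robot — can be sketched roughly as follows. The class, field names and units are illustrative assumptions, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class RobotState:
    """Position (m), heading (deg), speed (m/s), turret azimuth (deg)."""
    x: float
    y: float
    heading: float
    speed: float
    turret_az: float

class DigitalTwin:
    """Keeps a virtual 3D robot and a real robot in lockstep.

    Downlink: telemetry from the real robot updates the virtual model
    plotted on the AR electronic sand table. Uplink: operator gestures
    move the virtual model, and its resulting state is emitted as the
    command state for the real robot.
    """
    def __init__(self):
        self.virtual = RobotState(0.0, 0.0, 0.0, 0.0, 0.0)

    def on_telemetry(self, real: RobotState) -> RobotState:
        # Downlink: mirror the real robot onto the sand table.
        self.virtual = real
        return self.virtual

    def on_gesture(self, dx: float, dy: float, d_turret: float) -> RobotState:
        # Uplink: operator displaces the virtual robot and slews the
        # virtual turret; the new state becomes the command to uplink.
        self.virtual.x += dx
        self.virtual.y += dy
        self.virtual.turret_az = (self.virtual.turret_az + d_turret) % 360.0
        return self.virtual
```

A real implementation would also need timestamping, interpolation and loss handling on the data link; this only illustrates the bidirectional mapping.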
A method for the digital twin-based AR individual soldier and robot mixed-formation combat system, carried out on the system described above, comprises the following steps:
S1, the operator wears the augmented-reality AR glasses, logs in, starts a task, and rapidly constructs a three-dimensional electronic sand table from digital-map data;
S2, the position of the ground combat robot is set through gesture recognition or voice control, and the robot's route is planned;
S3, high-speed transmission takes place between the vehicle-mounted data-link terminal and the individual data-link terminal, comprising: the position, attitude and speed parameters of the ground combat robot; the visible-light and infrared reconnaissance images and aiming-reticle images of the photoelectric turret; the control data of the autopilot of the 3D ground combat robot in the AR electronic sand table; and the control data of the servo system of the 3D ground combat robot;
S4, the AR glasses receive the position, attitude and speed parameters of the real ground combat robot from the data link, and these data synchronously drive and plot the motion of the 3D ground combat robot in the AR electronic sand table, realizing real-time battlefield-situation display;
S5, the AR glasses receive the reconnaissance and aiming images of the real ground combat robot from the data link and display them on the virtual screen of the AR glasses for the operator to reconnoitre targets and aim for firing;
S6, while the reconnaissance image and aiming reticle are viewed through the AR glasses, suspicious targets are automatically marked and identified in the image, and the aiming reticle is controlled to aim at the target;
S7, the operator uses gesture recognition in the virtual space to control the virtual combat tasks of the virtual 3D ground combat robot; the operating parameters of the 3D ground combat robot are transmitted to the real ground combat robot, which is controlled to move in accordance with the 3D ground combat robot;
the operator's head attitude controls the rotation of the 3D virtual photoelectric turret, so that the motion parameters of the 3D ground combat robot and the 3D virtual photoelectric turret drive the movement and photoelectric reconnaissance of the real ground combat robot.
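The telemetry fields listed in S3 and S4 (position, attitude, speed) could be packed into a fixed-layout data-link frame, as a rough sketch. The field names, order and little-endian float encoding here are illustrative assumptions, not taken from the patent:

```python
import struct

# Hypothetical fixed-layout downlink frame carrying the robot's
# position (x, y, z), attitude (roll, pitch, yaw) and speed.
FRAME_FMT = "<7f"  # seven little-endian 32-bit floats

def pack_telemetry(x, y, z, roll, pitch, yaw, speed) -> bytes:
    """Serialize one telemetry sample for transmission over the data link."""
    return struct.pack(FRAME_FMT, x, y, z, roll, pitch, yaw, speed)

def unpack_telemetry(frame: bytes):
    """Deserialize a received frame back into its seven fields."""
    return struct.unpack(FRAME_FMT, frame)
```

A fielded data link would add framing, sequence numbers, checksums and encryption around such a payload; this only shows the payload round-trip.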
By adopting the above technical scheme, the invention achieves the following technical progress.
Wearable AR glasses, a microprocessor, a micro antenna and related equipment replace the handheld control unit of the ground combat robot; they are small, light and portable, free both of the operator's hands, and allow the operator to switch task roles rapidly and fight with weapons such as a rifle.
The invention uses AR glasses to construct an immersive three-dimensional virtual space that can display a two- or three-dimensional electronic sand table, plan the route of the ground combat robot, and construct virtual control handles, switches and buttons in the virtual space, operated through gesture recognition. The reconnaissance image of the ground combat robot is displayed on the virtual screen of the AR glasses. The position and motion parameters of the real ground combat robot are transmitted to the AR glasses in real time over the data link and plotted in the three-dimensional electronic sand table to display the battlefield situation in real time.
In the AR virtual space, the 3D ground combat robot on the AR electronic sand table is moved through gesture recognition, the operator's head attitude is sensed to control the rotation of the 3D virtual photoelectric turret, and the position and motion parameters of the 3D ground combat robot are plotted in the three-dimensional electronic sand table in real time. At the same time, the position and motion parameters of the 3D ground combat robot drive the motion of the real ground combat robot, and the displayed position and motion of the 3D ground combat robot on the AR electronic sand table are corrected, realizing a closely matched virtual-real mapping (digital twin) between the 3D and real ground combat robots.
The ground combat robot can drive autonomously along the planned route, and can even cut its communication link and maintain radio silence. It can drive according to the operator's position and speed, with the distance and bearing between operator and vehicle randomly regenerated within a set range on each update; this prevents the enemy from using the ground combat robot to pinpoint the operator's position and protects the operator. The ground combat robot can also be set to a following state, in which the motion parameters of the 3D ground combat robot are uplinked to the autopilot of the real ground combat robot so that the real robot follows the 3D robot's march, realizing the "lead sheep" navigation function of the 3D ground combat robot for one or more real ground combat robots.
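The randomized standoff idea — re-drawing the robot's distance and bearing from the operator within a band on each update, so the robot's observed position does not reveal the operator's — can be illustrated with a minimal sketch. The range band and function name are illustrative assumptions:

```python
import math
import random

def follow_waypoint(operator_x, operator_y,
                    min_range=30.0, max_range=80.0, rng=random):
    """Pick the next goal point for the robot near the operator.

    The standoff distance and bearing are drawn at random inside a band
    on each update, so an enemy observing the robot cannot infer the
    operator's exact position from it. The 30-80 m band is illustrative.
    """
    r = rng.uniform(min_range, max_range)
    bearing = rng.uniform(0.0, 2.0 * math.pi)
    return (operator_x + r * math.cos(bearing),
            operator_y + r * math.sin(bearing))
```

The autopilot would then steer toward the returned goal point; a real system would also filter out goals that fall on obstacles or outside the planned route corridor.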
Accompanied by the ground combat robot, the individual soldier can also execute support and sustainment tasks such as photoelectric reconnaissance, fire strike and fixed-point delivery; the virtual screen of the AR glasses can be opened to display the reconnaissance image and aiming reticle, and the ground combat robot can be authorized to track and fire automatically, or a manual aim-track-and-fire mode can be selected.
The position and navigation information of the operator and the ground combat robot can be encrypted and uploaded to a command information system, and combat orders, battlefield-situation data and other transmitted information are received and displayed on the AR glasses. When the operator is marching or performing other tasks, the display position or visibility of the helmet's infrared night-vision image and the robot's day/night images can be switched by voice control or gesture dragging, avoiding information clutter.
Drawings
FIG. 1 is a system block diagram of the digital twin-based AR individual soldier and robot mixed-formation combat system of the invention;
FIG. 2 shows the appearance of the AR glasses in the digital twin-based AR individual soldier and robot mixed-formation combat system of the invention;
FIG. 3 is a control-connection diagram of the AR optical module in the digital twin-based AR individual soldier and robot mixed-formation combat system of the invention;
FIG. 4 is a holographic optical-waveguide optical scheme of the AR optical module in the digital twin-based AR individual soldier and robot mixed-formation combat system of the invention;
FIG. 5 is a free-form-surface reflection scheme of the AR optical module in the digital twin-based AR individual soldier and robot mixed-formation combat system of the invention.
Detailed Description
The invention is described in further detail below with reference to the figures and specific embodiments.
As shown in FIG. 1, a digital twin-based AR individual soldier and robot mixed-formation combat system comprises: a ground combat robot, a data link, an AR display device, a wearable device and an AR application system.
The ground combat robot can march alongside the operator and can move ahead to reconnoitre, fight or perform other tasks.
The ground combat robot comprises an unmanned vehicle, an autopilot, a vehicle-mounted weapon, a photoelectric turret, a servo control system and a vehicle-mounted data-link terminal; the vehicle-mounted data-link terminal is connected through standard interfaces to the robot's autopilot, vehicle-mounted weapon, photoelectric turret and servo system.
The AR display device can construct the AR three-dimensional virtual space, fuse virtual and real content to display the real battlefield environment, personnel, equipment and virtual targets, support interactive operation of the virtual 3D ground combat robot, and display the AR electronic sand table and the battlefield situation, so that the virtual 3D ground combat robot provides the virtual-operation functions of the real ground combat robot.
In the AR three-dimensional virtual space constructed by the AR display device, the operator can interactively operate the photoelectric turret and autopilot of the 3D ground combat robot to perform photoelectric reconnaissance, tracking and aiming, vehicle movement and weapon firing. The real ground combat robot has a mapping relation with the 3D ground combat robot and can replicate its actions to execute marching, reconnaissance, mine clearance, explosive ordnance disposal, rescue and precision-shooting tasks, for operator skill training, combat-plan rehearsal and command training. The photoelectric turret can also perform image recognition and target tracking, used for operator skill training, combat-plan rehearsal and human-machine mixed-formation combat.
The AR display device is a pair of AR glasses comprising an AR optical module, a microprocessor, an image-acquisition module, sensors, a communication module and a structural assembly.
The microprocessor constructs and controls the AR three-dimensional virtual space and the 3D ground combat robot.
The image-acquisition module performs depth-of-field image processing and gesture recognition.
The sensors include a head-tracking sensor, a hand-tracking sensor, an eye-tracking sensor, an interpupillary-distance sensor, a voice AI chip and a video AI chip, whose outputs are connected to inputs of the microprocessor.
The communication module includes a Wi-Fi module and a Bluetooth module.
The structural assembly mounts and positions the AR optical module, microprocessor, image-acquisition module, sensors and communication module, and is integrated into individual goggles or directly onto a helmet.
Integrated into individual goggles or a helmet, the AR glasses let the wearer observe the real world in see-through fashion while viewing, on the virtual screen, virtual imagery generated by the microprocessor together with wirelessly received visible-light and infrared images from the photoelectric turret or other devices. The virtual screen of the AR glasses is far larger than the display of a handheld control unit, can show a three-dimensional electronic sand table, and constructs the AR three-dimensional virtual space.
The AR three-dimensional virtual space can display two- and three-dimensional electronic sand tables and battlefield situations, plan the route of the ground combat robot, and perform virtual combat tasks of the 3D ground combat robot such as marching, reconnaissance, mine clearance, explosive ordnance disposal, rescue and precision shooting, for operator skill training, combat-plan rehearsal and command training.
The AR glasses can be fitted with customized prescription lenses in place of the wearer's ordinary spectacles, lined inside the helmet goggles, and can display a large virtual screen: even AR glasses with a relatively small field of view of about 40 degrees can achieve the effect of an 86-inch virtual screen at 3 meters.
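The 86-inch figure can be checked with simple geometry, assuming the quoted 40-degree value is the diagonal field of view:

```python
import math

def virtual_screen_diagonal_inches(fov_deg: float, distance_m: float) -> float:
    """Diagonal of the apparent virtual screen at a given distance,
    assuming the quoted FOV is the diagonal field of view."""
    diag_m = 2.0 * distance_m * math.tan(math.radians(fov_deg) / 2.0)
    return diag_m / 0.0254  # metres to inches

# A 40-degree diagonal FOV at 3 m subtends roughly an 86-inch diagonal.
```

This matches the patent's claim: 2 x 3 m x tan(20 deg) is about 2.18 m, or just under 86 inches.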
The AR glasses are free-form-surface AR glasses with light transmittance above 50%, or optical-waveguide AR glasses with light transmittance above 80%, so the operator's view of the battlefield environment is not impaired.
The appearance of the AR glasses is shown in FIG. 2. The AR optical module typically uses RGB LED illumination with LCOS (liquid crystal on silicon) reflective projection technology, or an OLED (organic light-emitting diode) device, as shown in FIG. 3. The LCOS controller loads the processed signal into the LCOS panel and simultaneously sends RGB illumination signals to the illumination control system, which lights the RGB LEDs to illuminate the LCOS. An OLED is a multilayer organic thin-film device that produces electroluminescence: under an electric field, holes from the anode and electrons from the cathode are injected into the hole- and electron-transport layers and migrate to the emissive layer; when enough electrons and holes accumulate at the emissive-layer interface, they recombine into excitons, exciting the outermost electrons of the organic molecules from the ground state to an excited state; as these electrons return to the ground state, the energy is released as light, producing visible emission whose intensity is proportional to the injected current.
The AR optical module generally adopts an optical-waveguide or free-form-surface prism scheme: FIG. 4 shows a holographic optical-waveguide scheme and FIG. 5 a free-form-surface reflection scheme.
Visible-light or infrared images from the photoelectric turret of the ground combat robot can be transmitted wirelessly to the virtual screen of the AR glasses for remote detection of enemy targets. Visible-light or infrared images from the helmet's night-vision image intensifier or the individual weapon's imaging sight can be transmitted to the AR virtual screen by wire or wirelessly, giving the glasses a night-vision function without mechanically switching a helmet night-vision device; this also enables remote aiming of the weapon and a "no head exposure" shooting capability.
The AR three-dimensional virtual imagery constructs an immersive virtual space, supports virtual-real fused spatial mapping, can rapidly build the AR electronic sand table, performs hand tracking, eye tracking and head tracking, and controls the 3D ground combat robot, photoelectric turret and vehicle-mounted weapon through natural human interaction.
Wearing equipment, through data link and ground operation robot real-time communication, receiving ground operation robot's operating parameter to be connected with AR display device through standard interface. The operation parameters comprise photoelectric images, positions and motion parameters of the ground fighter robot.
Wearing equipment can navigate the location to controlling the hand to real-time plotting on AR electron sand table shows the position of controlling the hand on AR electron sand table.
The wearing equipment adopts an integrated and miniaturized design scheme and can be inserted into a left/right arm pocket of the camouflage clothes or a side pocket of a tactical backpack. The wearable device mainly comprises an individual navigation and positioning system, a data chain individual terminal, a power module and the like.
The individual navigation and positioning system navigates and positions the operator; its data are automatically plotted on the AR electronic sand table to display the operator's position.
The power module is used for providing electric energy.
The data-link individual terminal and the ground combat robot's onboard terminal adopt an integrated, miniaturized design, realizing real-time communication (modulated transmission and demodulated reception) of remote-control commands, telemetry and reconnaissance images; they connect to the individual navigation and positioning system and the AR glasses through standard interfaces. Data are exchanged between the data-link individual terminal and the data-link vehicle-mounted terminal.
Visible-light or infrared reconnaissance images, the aiming reticle and the like can be displayed on the AR virtual screen and automatically calibrated. The data link combines frequency hopping with spread spectrum to improve anti-jamming capability, uses quantum communication for confidential transmission, and uses TDMA or dynamic-TDMA networking control, realizing a two-way, high-speed, secure and jam-resistant tactical data communication system.
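The frequency-hopping idea above can be sketched in a few lines. The patent does not specify the hopping algorithm; this minimal Python illustration (all names and parameters are hypothetical) merely shows how two data-link terminals sharing a seed can derive the same channel schedule independently, so the schedule itself is never transmitted:

```python
import random

def hop_sequence(shared_seed: int, n_channels: int, length: int) -> list:
    """Derive a pseudo-random channel-hopping sequence from a seed shared
    by both data-link terminals (illustrative only)."""
    rng = random.Random(shared_seed)
    return [rng.randrange(n_channels) for _ in range(length)]

# Both terminals derive the same sequence from the same seed, so they can
# hop channels in lock-step without ever transmitting the schedule.
vehicle_seq = hop_sequence(0x5EED, n_channels=64, length=10)
soldier_seq = hop_sequence(0x5EED, n_channels=64, length=10)
```

A real tactical link would combine such a schedule with spread-spectrum modulation and TDMA slot allocation; this sketch only captures the shared-seed synchronization idea.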
The data link is a secure, jam-resistant tactical communication system. The data-link vehicle-mounted terminal connects to the ground combat robot's autopilot, navigation and positioning system, electro-optical turret and servo system through standard interfaces; the data-link individual terminal connects to the operator's navigation and positioning system and the AR glasses through standard interfaces.
The AR application system can rapidly reconstruct the ground combat robot and the real battlefield environment as an AR three-dimensional virtual space, so that the 3D ground combat robot and the real ground combat robot share the same space-time with a virtual-real mapping relationship. It supports digital-twin mixed-formation combat between the AR individual soldier and the ground combat robot, and realizes a 'bellwether' lead function of the 3D ground combat robot over one or more real ground combat robots.
The AR application system comprises AR management software, AR electronic sand table software, reconnaissance-and-aiming software, 3D ground combat robot control software and AR digital-twin software.
The AR management software is used for system login, task management, virtual screen and space management, model management, and database management.
The AR electronic sand table software takes the form of a two-dimensional or three-dimensional electronic sand table; it can display, plot and compute the battlefield situation and plan routes for the ground combat robot. The real robot's position, attitude, speed and other parameters drive the 3D ground combat robot on the sand table to plot synchronously, realizing real-time battlefield situation display.
The reconnaissance-and-aiming software displays reconnaissance and aiming images on the virtual screen of the AR glasses. By acquiring head-tracking sensor parameters, it can drive the real robot's electro-optical turret to rotate, reconnoitre and discriminate suspicious targets marked by automatic image recognition, and drive the aiming reticle onto the target for shooting.
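The head-attitude-to-turret coupling described above can be sketched as a rate-limited servo command: each control cycle, the turret steps toward the head tracker's yaw and pitch, but never faster than the servo's slew limit. This is an illustrative assumption, not the patent's implementation; the function name and the slew limit are invented for the example:

```python
def turret_command(head_yaw, head_pitch, current_yaw, current_pitch,
                   max_step=5.0):
    """Map the operator's head attitude (degrees, from the AR glasses'
    head tracker) onto turret axis commands, limiting each axis to a
    maximum slew per control cycle. Names and limits are illustrative."""
    def step(current, target):
        # Clamp the requested change to the per-cycle slew limit.
        delta = max(-max_step, min(max_step, target - current))
        return current + delta
    return step(current_yaw, head_yaw), step(current_pitch, head_pitch)
```

For example, a sudden 30° head turn would be tracked in 5° increments over successive cycles, while small aiming corrections are followed immediately.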
The 3D ground combat robot control software handles virtual driving and reconnaissance of the 3D robot in the virtual space, and image recognition and target tracking by the 3D virtual electro-optical turret; it is used for robot control skill training, combat scheme deduction and man-machine mixed-formation operation.
On one hand, in the AR digital-twin software, the real robot's position, attitude and speed parameters drive the 3D ground combat robot on the AR electronic sand table to move synchronously, realizing real-time battlefield situation display. On the other hand, the operator moves the 3D robot on the three-dimensional electronic sand table through gesture recognition, and the operator's head attitude is sensed to control rotation of the 3D virtual electro-optical turret, so that the motion parameters of the 3D robot and its virtual turret drive the real robot's movement, electro-optical reconnaissance and fire employment, realizing a closely matched virtual-real mapping (digital twin) between the 3D robot and the real robot. The software can also realize the 'bellwether' piloting function of the 3D robot over one or more real ground combat robots.
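The two directions of this virtual-real mapping — downlink telemetry driving the 3D model, and operator gestures on the 3D model becoming setpoints for the real robot — can be sketched as a small state-mirroring class. All names here are illustrative inventions, not from the patent:

```python
from dataclasses import dataclass

@dataclass
class Pose:
    x: float        # metres east
    y: float        # metres north
    heading: float  # degrees

class DigitalTwin:
    """Minimal sketch of the two-way virtual-real mapping (hypothetical
    names; the patent does not define a software interface)."""
    def __init__(self):
        self.virtual_pose = Pose(0.0, 0.0, 0.0)

    def on_telemetry(self, real_pose: Pose) -> None:
        # Downlink: the real robot's pose drives the 3D model on the sand table.
        self.virtual_pose = real_pose

    def command_for_real_robot(self, gesture_target: Pose) -> Pose:
        # Uplink: the operator moves the 3D model; its new pose becomes
        # the setpoint sent to the real robot's autopilot.
        self.virtual_pose = gesture_target
        return self.virtual_pose

twin = DigitalTwin()
twin.on_telemetry(Pose(10.0, 5.0, 90.0))        # real robot reported here
setpoint = twin.command_for_real_robot(Pose(12.0, 7.0, 45.0))  # operator gesture
```

The key property is that a single state object is driven from either side, so virtual and real poses never diverge for long.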
There is no industry-standard definition of Digital Twin; the present invention uses the commonly accepted sense, namely that a virtual 3D object (the digital twin) and a physical entity (the physical twin) maintain a closely matched virtual-real mapping relationship.
After the wearable device collects the ground combat robot's operating parameters, the AR application system drives the 3D ground combat robot to plot synchronously on the AR electronic sand table, displaying the battlefield situation in real time and realizing the virtual-real mapping between the 3D robot and the real robot.
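Telemetry arrives over the data link at discrete intervals, while the sand-table plot refreshes continuously. One common way to keep the 3D robot moving smoothly (the patent only states that plotting is synchronous, so this interpolation step is an illustrative assumption) is to interpolate between the last two telemetry samples:

```python
def interpolate_pose(p0, p1, t0, t1, t):
    """Linearly interpolate a 2-D position between two telemetry samples
    (p0 at time t0, p1 at time t1) so the 3D model on the sand table
    moves smoothly between data-link updates. Illustrative sketch."""
    if t1 == t0:
        return p1
    # Clamp the blend factor so we never extrapolate past the samples.
    a = max(0.0, min(1.0, (t - t0) / (t1 - t0)))
    return (p0[0] + a * (p1[0] - p0[0]),
            p0[1] + a * (p1[1] - p0[1]))
```

At render time `t` halfway between two updates, the plotted robot sits halfway between the two reported positions; past `t1`, it holds at the latest report until new telemetry arrives.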
A method for the digital-twin-based AR individual soldier and robot mixed-formation combat system is carried out on the system described above and comprises the following steps.
and S1, wearing augmented reality AR glasses by a control hand, logging in and starting a task, and quickly constructing a three-dimensional electronic sand table according to data of the digital map.
S2, the electronic sand table provides display, plotting and computation of the two-dimensional and three-dimensional battlefield situation; the operator sets the ground combat robot's position through gesture recognition or voice control and plans its route.
S3, high-speed transmission between the data-link vehicle-mounted terminal and the data-link individual terminal, comprising: the ground combat robot's position, attitude and speed parameters; visible-light and infrared reconnaissance images and aiming-reticle images from the electro-optical turret; control data for the autopilot derived from the 3D robot in the AR electronic sand table; and control data for the vehicle-mounted weapon and the electro-optical turret servo system derived from the 3D robot.
S4, the AR glasses receive the real robot's position, attitude and speed parameters over the data link, and synchronous data-driven plotting moves the 3D robot in the AR electronic sand table, realizing real-time battlefield situation display.
S5, the AR glasses receive the real robot's reconnaissance and aiming images over the data link and display them on the virtual screen for the operator to reconnoitre targets and aim for shooting.
S6, while the operator views the reconnaissance image and aiming reticle in the AR glasses, suspicious targets marked by automatic image recognition are reconnoitred and screened, and the aiming reticle is controlled onto the target. Head-tracking sensor parameters can be acquired and uplinked to the data-link vehicle-mounted terminal and the robot's electro-optical turret servo system, driving the real turret to rotate and the aiming reticle onto the target, thereby supporting tasks such as remote-controlled aimed shooting and fixed-point delivery for the operator.
S7, through gesture recognition in the virtual space, the operator uses the virtual control equipment of the 3D ground combat robot in the AR glasses to command the virtual combat tasks of the virtual-reality 3D robot: marching, reconnaissance, mine clearance, explosive-ordnance disposal, rescue, precision shooting and the like. This also supports image recognition and target tracking by the 3D virtual electro-optical turret, and is used for robot control skill training, combat scheme deduction and man-machine mixed-formation operation.
The operating parameters of the 3D ground combat robot are transmitted to the real ground combat robot, controlling the real robot to move along with the 3D robot; the operator's head attitude controls rotation of the 3D virtual electro-optical turret, so that the motion parameters of the 3D robot and its virtual turret drive the real robot's movement and electro-optical reconnaissance.
In the AR glasses, the operator moves the 3D robot on the three-dimensional electronic sand table through gesture recognition, and the operator's head attitude is sensed to control rotation of the 3D virtual electro-optical turret, so that the motion parameters of the 3D robot and its virtual turret drive the real robot's movement, electro-optical reconnaissance and fire employment, realizing the closely matched digital-twin relationship between the 3D robot and the real robot.
The 'bellwether' piloting function of the 3D robot over one or more real ground combat robots can also be realized, as follows: the real ground combat robot is set to a following state, the 3D robot's motion parameters are uplinked to the real robot's autopilot, and the real robot is controlled to march following the 3D robot.
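The following state described above can be sketched as a waypoint queue: the virtual leader's motion is recorded, and the real robot replays it with a fixed stand-off delay. The class name, queue mechanism and stand-off parameter are hypothetical, invented for illustration:

```python
from collections import deque

class FollowController:
    """'Bellwether' following sketch: the 3D robot's motion parameters are
    queued and uplinked as waypoints to the real robot's autopilot, which
    replays them a fixed number of samples behind the virtual leader."""
    def __init__(self, standoff_steps=2):
        self.waypoints = deque()
        self.standoff = standoff_steps

    def record_virtual(self, xy):
        # Each sample of the virtual leader's track becomes a waypoint.
        self.waypoints.append(xy)

    def next_real_setpoint(self):
        # The real robot lags the virtual leader by `standoff` samples;
        # until the lag is built up, no setpoint is issued.
        if len(self.waypoints) > self.standoff:
            return self.waypoints.popleft()
        return None

ctrl = FollowController(standoff_steps=2)
for p in [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]:
    ctrl.record_virtual(p)
first = ctrl.next_real_setpoint()   # lag established: oldest waypoint issued
second = ctrl.next_real_setpoint()  # back within stand-off: hold
```

Leading several real robots would simply attach one such queue per follower, each with its own stand-off.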
The invention proposes a concept of mixed-formation combat between individual soldiers and ground combat robots. Following a planned path or in autonomous-driving mode, the robot's movement, reconnaissance and shooting are fully automatic; it observes, decides, reacts and acts faster than a human soldier, and its iterative evolution is far faster than that of humans. The ground combat robot will therefore be the main force directly wielding weapons on future battlefields, but it lacks human associative thinking and strategic ability, a gap that man-machine mixed-formation combat fills well. In such combat, the human and the robot are not tied together: the ground combat robot operates forward, while the operator commands the engagement from a safe position.
Firing of the ground combat robot's weapon is remotely controlled: each shot must be human-authorized or manually operated, keeping the firing authority in human hands and preventing the robot from running out of control. The robot can march autonomously along a planned path or accompany the individual soldier. When accompanying, the operator's satellite-positioning parameters drive the real robot to advance for reconnaissance and strike, and the distance and bearing between the two are generated at random within a certain range each time, preventing the enemy from using the robot's precise position to locate the operator and thereby ensuring the operator's safety.
When the operator drives the ground combat robot in an accompanying march or advances it for reconnaissance and strike, the two must stay within a certain distance range, and their relative distance and bearing can be set at random for every task; the purpose is to prevent the enemy from locking on and striking due to any regular relative-position relationship, while still realizing the robot's accompanying march with the operator. During an accompanying march, the position and motion parameters of the operator's navigation and positioning system are superposed with the preset relative distance and bearing to calculate the robot's required position, and the robot is controlled to accompany the operator autonomously, realizing a 'dog-walking' style of march; the robot can also move autonomously along a planned path.
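The accompany-march calculation above — the operator's navigation fix superposed with a randomly drawn stand-off distance and bearing — can be sketched as follows. The distance bounds and the local east/north coordinate frame are illustrative assumptions:

```python
import math
import random

def accompany_position(operator_xy, rng, min_dist=50.0, max_dist=150.0):
    """Compute the ground robot's required position from the operator's
    navigation fix plus a randomly drawn stand-off distance and bearing,
    so the operator cannot be localised from the robot's position.
    Coordinates are local east/north metres; ranges are illustrative."""
    dist = rng.uniform(min_dist, max_dist)
    bearing = rng.uniform(0.0, 2.0 * math.pi)
    robot_xy = (operator_xy[0] + dist * math.cos(bearing),
                operator_xy[1] + dist * math.sin(bearing))
    return robot_xy, dist

# Draw a fresh offset for this task from a per-task random source.
rng = random.Random(42)
robot_xy, dist = accompany_position((100.0, 200.0), rng)
```

Because the offset is redrawn each task (or each sortie), an enemy observing the robot's track cannot infer a fixed displacement back to the operator.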
While the operator marches or performs other tasks, the display positions of the AR glasses' infrared night-vision image and the robot's day/night images can be switched or hidden by voice control or gesture dragging, avoiding information interference. Real-time control by the operator is not required, and the communication link with the ground combat robot can even be cut to maintain radio silence.
Fitting the individual weapon with an image sight enables remote-controlled aiming of the weapon and realizes the 'no head exposure' shooting function.
The position and navigation information of the operator and the ground combat robot can be encrypted against jamming and uploaded to the higher-level combat command system, while combat orders, battlefield situation and other information transmitted from that system are received.

Claims (9)

1. A digital-twin-based AR individual soldier and robot mixed-formation combat system, characterized by comprising:
a ground combat robot, capable of accompanying a march and of advancing to reconnoitre, fight or execute other tasks;
an AR display device, capable of constructing an AR three-dimensional virtual space, displaying the real battlefield environment, personnel, equipment and virtual targets in virtual-real fusion, interactively operating a virtual 3D ground combat robot, and displaying an AR electronic sand table and the battlefield situation;
a wearable device, in real-time communication with the ground combat robot through a data link, receiving the ground combat robot's operating parameters and connected with the AR display device; the wearable device can navigate and position the operator and plot in real time on the AR electronic sand table; and
an AR application system, capable of rapidly reconstructing the ground combat robot and the real battlefield environment as an AR three-dimensional virtual space, so that the 3D ground combat robot and the real ground combat robot share the same space-time with a virtual-real mapping relationship; the system can conduct digital-twin mixed-formation combat of the AR individual soldier and the ground combat robot, and realizes a 'bellwether' lead function of the 3D ground combat robot over one or more real ground combat robots.
2. The digital-twin-based AR individual soldier and robot mixed-formation combat system according to claim 1, wherein the ground combat robot comprises an unmanned vehicle, an autopilot, a vehicle-mounted weapon, an electro-optical turret, a servo control system and a data-link vehicle-mounted terminal, the data-link vehicle-mounted terminal being connected through standard interfaces with the autopilot, the vehicle-mounted weapon, the electro-optical turret and the servo system of the ground combat robot.
3. The digital-twin-based AR individual soldier and robot mixed-formation combat system according to claim 2, wherein, in the AR three-dimensional virtual space constructed by the AR display device, the operator can interactively operate the electro-optical turret and autopilot of the 3D ground combat robot to perform electro-optical reconnaissance, tracking and aiming, vehicle movement and weapon shooting; the real ground combat robot has a mapping relationship with the 3D ground combat robot, can copy the 3D robot's actions, and executes marching, reconnaissance, mine-clearance, explosive-ordnance-disposal, rescue and precision-shooting combat tasks; the system is used for robot control skill training, combat scheme deduction and combat command training.
4. The digital-twin-based AR individual soldier and robot mixed-formation combat system according to claim 2, wherein the wearable device can be slipped into a left/right arm pocket of a camouflage uniform or a side pocket of a tactical backpack; the wearable device comprises:
an individual navigation and positioning system, which navigates and positions the operator, its data being automatically plotted on the AR electronic sand table to display the operator's position;
a data-link individual terminal, which realizes real-time communication of remote-control commands, telemetry and reconnaissance images and is connected with the individual navigation and positioning system and the AR glasses through standard interfaces, data being transmitted between the data-link individual terminal and the data-link vehicle-mounted terminal; and
a power module for providing electric power.
5. The digital-twin-based AR individual soldier and robot mixed-formation combat system according to claim 4, wherein the operator's position and speed parameters can drive the real ground combat robot to advance for reconnaissance and strike, and the distance and bearing between the operator and the real ground combat robot are randomly generated within a certain range each time.
6. The digital-twin-based AR individual soldier and robot mixed-formation combat system according to claim 1, wherein the AR display device is AR glasses, the AR glasses comprising an AR optical module, a microprocessor, an image acquisition module, sensors, a communication module and a structural component;
the microprocessor is used for constructing and controlling an AR three-dimensional virtual space and a 3D ground combat robot;
the image acquisition module is used for depth-of-field image processing and gesture recognition;
the sensors comprise a head-tracking sensor, a hand-tracking sensor, an eye-tracking sensor, an interpupillary-distance measuring sensor, a voice AI chip and a video AI chip;
the communication module comprises a Wi-Fi communication module and a Bluetooth communication module;
the structural component mounts and positions the AR optical module, the microprocessor, the image acquisition module, the sensors and the communication module, and is integrated directly on the individual soldier's goggles or helmet.
7. The digital-twin-based AR individual soldier and robot mixed-formation combat system according to claim 6, wherein the AR glasses can construct an immersive virtual space, perform virtual-real fused space mapping, rapidly construct an AR electronic sand table, perform hand tracking, eye tracking and head tracking, and operate and control the 3D ground combat robot and its electro-optical turret and vehicle-mounted weapon through natural human interaction.
8. The digital-twin-based AR individual soldier and robot mixed-formation combat system according to claim 6, wherein the AR application system comprises:
AR management software for system login, task management, virtual screen and space management, model management and database management;
AR electronic sand table software, which takes the form of a two-dimensional or three-dimensional electronic sand table, can display, plot and compute the battlefield situation, and can plan routes for the ground combat robot;
reconnaissance-and-aiming software, which displays reconnaissance and aiming images on the virtual screen of the AR glasses; by acquiring head-tracking sensor parameters, it can drive the real robot's electro-optical turret to rotate, reconnoitre and discriminate suspicious targets marked by automatic image recognition, and drive the aiming reticle onto the target for shooting;
3D ground combat robot control software, used for robot control skill training, combat scheme deduction and man-machine mixed-formation operation; and
AR digital-twin software: on one hand, the real robot's position, attitude and speed parameters drive the 3D ground combat robot on the AR electronic sand table to move synchronously, realizing real-time battlefield situation display; on the other hand, the 3D robot on the three-dimensional electronic sand table is moved through gesture recognition and the operator's head attitude is sensed to control rotation of the 3D virtual electro-optical turret, so that the motion parameters of the 3D robot and its virtual turret drive the real robot's movement, electro-optical reconnaissance and fire employment, realizing the virtual-real mapping relationship between the 3D robot and the real robot.
9. A method for a digital-twin-based AR individual soldier and robot mixed-formation combat system, characterized in that the method is based on the system according to any one of claims 1 to 8 and comprises the following steps:
S1, the operator wears augmented reality (AR) glasses, logs in and starts a task, and a three-dimensional electronic sand table is rapidly constructed from digital map data;
S2, the ground combat robot's position is set through gesture recognition or voice control, and its route is planned;
S3, high-speed transmission between the data-link vehicle-mounted terminal and the data-link individual terminal, comprising: the ground combat robot's position, attitude and speed parameters; visible-light and infrared reconnaissance images and aiming-reticle images of the electro-optical turret; control data of the autopilot derived from the 3D ground combat robot in the AR electronic sand table; and control data of the 3D ground combat robot's servo system;
S4, the AR glasses receive the real robot's position, attitude and speed parameters over the data link, and data-driven plotting synchronously moves the 3D robot in the AR electronic sand table, realizing real-time battlefield situation display;
S5, the AR glasses receive the real robot's reconnaissance and aiming images over the data link and display them on the virtual screen for the operator to reconnoitre targets and aim for shooting;
S6, while the reconnaissance image and aiming reticle are viewed in the AR glasses, suspicious targets marked by automatic image recognition are reconnoitred and discriminated, and the aiming reticle is controlled onto the target;
S7, the operator operates in the virtual space through gesture recognition to command the virtual combat tasks of the virtual-reality 3D ground combat robot; the 3D robot's operating parameters are transmitted to the real ground combat robot, controlling it to move along with the 3D robot;
the operator's head attitude controls rotation of the 3D virtual electro-optical turret, so that the motion parameters of the 3D robot and its virtual turret drive the real robot's movement and electro-optical reconnaissance.
CN202010910620.4A 2020-09-02 2020-09-02 AR individual soldier and robot mixed-editing combat system and method based on digital twin Pending CN112150635A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010910620.4A CN112150635A (en) 2020-09-02 2020-09-02 AR individual soldier and robot mixed-editing combat system and method based on digital twin

Publications (1)

Publication Number Publication Date
CN112150635A true CN112150635A (en) 2020-12-29

Family

ID=73890620

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010910620.4A Pending CN112150635A (en) 2020-09-02 2020-09-02 AR individual soldier and robot mixed-editing combat system and method based on digital twin

Country Status (1)

Country Link
CN (1) CN112150635A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101343860B1 (en) * 2013-01-03 2013-12-20 재단법인대구경북과학기술원 Robot avatar system using hybrid interface and command server, learning server, and sensory server therefor
US20150287340A1 (en) * 2012-05-25 2015-10-08 BAE Systems Hägglunds Aktiebolag Method and system for evaluation of ground combat vehicle manoeuvring
CN107193371A (en) * 2017-04-28 2017-09-22 上海交通大学 A kind of real time human-machine interaction system and method based on virtual reality
CN108257145A (en) * 2017-12-13 2018-07-06 北京华航无线电测量研究所 A kind of UAV Intelligent based on AR technologies scouts processing system and method
CN108415460A (en) * 2018-03-29 2018-08-17 北京航空航天大学 A kind of combination separate type rotor and sufficient formula moving operation machine people concentration-distributed control method
CN108646589A (en) * 2018-07-11 2018-10-12 北京晶品镜像科技有限公司 A kind of battle simulation training system and method for the formation of attack unmanned plane
US10152771B1 (en) * 2017-07-31 2018-12-11 SZ DJI Technology Co., Ltd. Correction of motion-based inaccuracy in point clouds

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112975982A (en) * 2021-03-16 2021-06-18 北京理工大学 Air-ground cooperative multi-robot system based on brain-computer fusion
CN114527425A (en) * 2022-02-24 2022-05-24 杨邦会 Mine personnel positioning method based on digital twinning
CN114527425B (en) * 2022-02-24 2023-01-10 中国科学院空天信息创新研究院 Mine personnel positioning method based on digital twinning
CN114815654A (en) * 2022-03-01 2022-07-29 北京理工大学 Unmanned vehicle control-oriented digital twin system and construction method thereof
CN114815654B (en) * 2022-03-01 2023-02-24 北京理工大学 Unmanned vehicle control-oriented digital twin system and construction method thereof
CN115296682A (en) * 2022-07-13 2022-11-04 华东计算技术研究所(中国电子科技集团公司第三十二研究所) Multifunctional intelligent handheld system and method applied to individual combat
CN116740293A (en) * 2023-06-13 2023-09-12 西安速度时空大数据科技有限公司 Digital twinning-based three-dimensional terrain model acquisition method, device and storage medium
CN116476100A (en) * 2023-06-19 2023-07-25 兰州空间技术物理研究所 Remote operation system of multi-branch space robot

Similar Documents

Publication Publication Date Title
CN112150635A (en) AR individual soldier and robot mixed-editing combat system and method based on digital twin
CN113885208B (en) Display device, display system, and control method of display device
US10599142B2 (en) Display device and control method for display device
US20230244299A1 (en) Mixed reality high-simulation battlefield first aid training platform and training method using same
US7548697B2 (en) Method and device for controlling a remote vehicle
JP2021535353A (en) Display system for observation optics
US7843431B2 (en) Control system for a remote vehicle
US10030931B1 (en) Head mounted display-based training tool
WO2005079352A2 (en) Weapon ball stock with integrated weapon orientation
CN115158661A (en) Apparatus, system and method for unmanned aerial vehicle
US10310502B2 (en) Head-mounted display device, control method therefor, and computer program
CN110045745A (en) It is a kind of for controlling the wearable device and UAV system of unmanned plane
CN107140209A (en) A kind of unmanned plane targeted system
JP2018165066A (en) Head mounted display and method for controlling the same
US20230097676A1 (en) Tactical advanced robotic engagement system
CN112099620A (en) Combat collaboration system and method for soldier and team combat
KR20170090888A (en) Apparatus for unmanned aerial vehicle controlling using head mounted display
US20190355179A1 (en) Telepresence
US11804052B2 (en) Method for setting target flight path of aircraft, target flight path setting system, and program for setting target flight path
CN210512849U (en) A automatic rifle of 95 formulas of simulation for real soldier's virtual reality training
CN108145688A (en) A kind of public safety mobile-robot system and public safety mobile robot
CN108167793A (en) Intelligent lamp and lamp control method
Gromov et al. Intuitive 3D control of a quadrotor in user proximity with pointing gestures
US20210335145A1 (en) Augmented Reality Training Systems and Methods
ES2961614T3 (en) Intelligent system for controlling functions in a turret of a combat vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination