CN214148982U - Target-shooting robot countermeasure system based on binocular recognition - Google Patents
- Publication number
- CN214148982U (application CN202023224665.0U)
- Authority
- CN
- China
- Prior art keywords
- control panel
- binocular
- robot
- target
- shooting
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Landscapes
- Traffic Control Systems (AREA)
- Navigation (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
The utility model discloses a target practice robot confrontation system based on binocular recognition. The business layer comprises a PC-side software system; the system core layer comprises a GNSS base station, a GNSS mobile station, a micro control unit and a navigation system; and the execution layer comprises a bottom control panel, a laser simulation weapon system (an infrared transmitter, an infrared wireless ring-reporting target and an infrared receiver), an emergency braking system (an ultrasonic device, a radar, an anti-collision strip and an emergency stop switch), a wireless communication system (a wireless transmission module and a remote controller), a binocular recognition and positioning system (a binocular high-speed camera and a marking assembly), a power system (motor drivers and motors) and a battery management system (a power supply and a coulometer). The disclosed confrontation system realizes intelligent confrontation shooting, shot avoidance and other functions, brings training with a target practice robot closer to actual combat, and helps improve the training efficiency of soldiers.
Description
Technical Field
The utility model relates to the field of target practice robot systems.
Background
Shooting training is a common subject for special forces soldiers and is intended to continuously improve their military skills. To raise the efficiency and safety of training and to overcome defects of traditional fixed target positions, target practice robots have been developed in China. Such robots offer unpredictable moving target positions, automatic score reporting and similar advantages, which greatly improve training efficiency. However, although replacing traditional target-position training with a target practice robot improves the training effect, existing target practice robots still diverge from actual combat: they cannot intelligently evade the trainee or simulate human behavior. In view of this, the inventors of the present case improved the existing target practice robot and, after many test and training experiments, arrived at the present design.
SUMMARY OF THE UTILITY MODEL
The aim of the utility model is to overcome the shortcomings of existing target practice robots, which diverge from actual combat and lack intelligent evasion and human-behavior simulation, by proposing a target practice robot confrontation system based on binocular recognition.
In order to achieve the above purpose, the technical scheme of the utility model is as follows: a target practice robot confrontation system based on binocular recognition, characterized by comprising a business layer, a system core layer and an execution layer;
the business layer comprises a PC-side software system; the PC-side software system communicates with the system core layer and the execution layer through a wireless module;
the system core layer comprises a GNSS base station, a GNSS mobile station, a micro control unit and a navigation system; the GNSS mobile station and the micro control unit are connected to and communicate with the navigation system, and the GNSS base station and the GNSS mobile station communicate through a wireless module;
the execution layer comprises a bottom control panel, a laser simulation weapon system, an emergency braking system, a wireless communication system, a binocular recognition and positioning system, a power system and a battery management system. The laser simulation weapon system comprises an infrared transmitter and an infrared wireless ring-reporting target connected with the bottom control panel, and an infrared receiver worn on the trainee; the emergency braking system comprises an ultrasonic device, a radar, an anti-collision strip and an emergency stop switch connected with the bottom control panel; the wireless communication system comprises a wireless transmission module and a remote controller connected with the bottom control panel; the binocular recognition and positioning system comprises a binocular high-speed camera connected with the bottom control panel and a marking assembly worn on the trainee; the power system comprises a motor driver connected with the bottom control panel and a motor connected with the motor driver; and the battery management system comprises a power supply connected with the bottom control panel and a coulometer connected with the power supply. The navigation system is connected to and communicates with the bottom control panel.
The laser simulation weapon system further comprises a display module and a voice broadcast module connected with the bottom control panel; the output terminal of the display module is an LED display screen, and the output terminal of the voice broadcast module is a loudspeaker.
In the target practice confrontation working method of the binocular-recognition-based target practice robot confrontation system, the battery management system supplies power to the whole target practice robot, detects the battery use state in real time through the coulometer to obtain battery state information, and sends the battery state information to the terminal equipment through the wireless module;
the binocular recognition positioning system calculates position and distance information which takes a marking component on the body of the trainee as a base point, the position and distance information is current position information, the upcoming movement track position information of the trainee is predicted, the upcoming movement track position information is predicted position information, and the current position information and the predicted position information are sent to a laser simulation weapon system and a navigation system through a bottom control panel;
the laser simulated weapon system starts a laser transmitter to carry out countermeasure according to the current position information, and transmits an infrared signal beam with an encryption function to the current position of a trainee, and during the countermeasure, the infrared wireless receiving target and an infrared receiver on the body of the trainee detect whether a received infrared signal exists in a corresponding receiving area in real time, if the received infrared signal exists, the detected infrared signal is analyzed and decoded, the actual coordinate of an infrared signal drop point is calculated, and the actual coordinate is simultaneously transmitted to a display module and a voice broadcasting module to be displayed and broadcasted;
the navigation system compares the current position information and the predicted position information with the position information of the target practice robot, calculates a navigation path for avoiding a trainee by using an obstacle avoidance algorithm plan, obtains waypoint position information and navigation path information, sends the waypoint position information and the navigation path information to the bottom control panel, the bottom control panel calculates the waypoint position information and the navigation path information through a power system to obtain driving information of a motor which drives the target practice robot to move to a waypoint position and sends the driving information to a motor driver, and the motor driver controls the motor which drives the target practice robot to move to work; meanwhile, the navigation system obtains the position sensing information of the ultrasonic device and the radar of the emergency braking system and fuses the position sensing information into the planning calculation of the navigation path, and the automatic positioning and the posture adjustment of the target practice robot are continuously carried out;
the emergency braking system can perform emergency braking treatment when the anti-collision strip is collided or can perform emergency braking treatment manually through an emergency switch.
The control flow of the binocular recognition and positioning system comprises the following steps:
before use, calibration is performed with the binocular high-speed camera: the trainee stands entirely within the shooting area and wears the marking assembly at a chosen characteristic position; the binocular high-speed camera captures images of the trainee as marking images; an image processing algorithm detects the marking assembly in the marking images to obtain a detection result; the spatial coordinates of the marking assembly in the three-dimensional world coordinate system are then calculated, completing calibration;
during training, the marking assembly is worn at the trainee's characteristic position and the system operates as follows:
first, stereo video is acquired: while the target practice robot moves, the binocular high-speed camera captures and records video images as the current video image;
then, corresponding matching points between frames are obtained: feature point matching is applied to consecutive frames of the current video image;
then, the camera displacement is calculated from the coordinate changes of the matching points in image space, or by establishing three-dimensional coordinates, to obtain a camera displacement value;
then, binocular vision positioning is performed to obtain the position and rotation angle of the binocular high-speed camera at each moment of the movement; combined with Kalman filtering, the camera's route over the whole process is obtained, giving real-time binocular visual positioning of the target practice robot;
finally, the data obtained from the current video image is processed to derive the information needed to compute the trainee's current and predicted position information, which is sent to the central processing module for calculation.
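The patent does not specify how the trainee's predicted position information is computed from the current track. One minimal, commonly used sketch is constant-velocity extrapolation over the last two marker positions; the model and function name are assumptions, not the patented method.

```python
def predict_next(prev, curr):
    """Constant-velocity extrapolation: next position is the current one
    plus the displacement observed over the last frame interval."""
    return tuple(c + (c - p) for p, c in zip(prev, curr))

# Marker moved from (0.0, 0.0) to (0.5, 0.2) between frames:
print(predict_next((0.0, 0.0), (0.5, 0.2)))  # (1.0, 0.4)
```

A deployed system would more likely fold prediction into the Kalman filter's state (position plus velocity) rather than extrapolate raw detections.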
By adopting the above technical scheme, the beneficial effects of the utility model are as follows: the system structure forms a target practice robot confrontation system capable of evasive confrontation. The system identifies and locates the trainee with the binocular recognition and positioning system, realizes confrontation training through the laser simulation weapon system, plans trajectories that evade the trainee through the navigation system, and executes the evasive motion by driving the motors via the motor drivers according to the planned trajectory through the power system. A target practice robot using this confrontation system thus realizes intelligent confrontation shooting as well as shot avoidance and other intelligent functions not found in existing target practice robots. These functions bring target-robot training closer to actual combat and improve the training efficiency of soldiers.
When a target practice robot using the utility model's confrontation system encounters an emergency, several subsystems can trigger emergency braking. Operators can view the robot's parameters through the battery management system and the PC side and observe its usage at any time; in addition, the robot can also be controlled through wireless remote control equipment, meeting the various demands of target practice robot use.
Drawings
Fig. 1 is a schematic block diagram of a countermeasure system according to the present invention;
fig. 2 is a control flow chart of the binocular identification positioning system according to the present invention;
fig. 3 is a control flow chart of the laser simulated weapon system according to the present invention;
fig. 4 is a navigation control flowchart of the targeting robot according to the present invention.
Detailed Description
In order to further explain the technical solution of the present invention, the present invention is explained in detail by the following embodiments.
The utility model discloses a target practice robot confrontation system based on binocular recognition, as shown in Figs. 1 to 4, comprising a business layer, a system core layer and an execution layer. Compared with existing target practice robots, a robot using this system achieves more intelligent functions: its control system constitutes a confrontation system that can both engage and evade the trainee. The system architecture (Fig. 1) and the control processes are described below with reference to the drawings.
The business layer comprises a PC-side software system 11, which communicates with the system core layer and the execution layer through a wireless module;
the system core layer comprises a GNSS base station 21, a GNSS mobile station 22, a micro control unit (MCU) 23 and a navigation system 24. The GNSS mobile station 22 and the MCU 23 are connected to and communicate with the navigation system 24, and the GNSS base station 21 and the GNSS mobile station 22 communicate through a wireless module; GNSS refers to the global navigation satellite system;
the execution layer comprises a bottom control panel 31, a laser simulation weapon system 32, an emergency braking system 33, a wireless communication system 34, a binocular recognition and positioning system 35, a power system 36 and a battery management system 37. The laser simulation weapon system 32 comprises an infrared transmitter 321 and an infrared wireless ring-reporting target 322 connected with the bottom control panel 31, and an infrared receiver 323 worn on the trainee; the ring-reporting target 322 reports scores automatically, is simple to use, and has high scoring precision. The laser simulation weapon system 32 further comprises a display module 324 and a voice broadcast module 325 connected with the bottom control panel 31; the output terminal of the display module 324 may be an LED display screen, and the output terminal of the voice broadcast module 325 may be a loudspeaker. The emergency braking system 33 comprises an ultrasonic device 331, a radar 332, an anti-collision strip 333 and an emergency stop switch 334 connected with the bottom control panel 31, which detect obstacles and perform emergency braking on collision. The wireless communication system 34 comprises a wireless transmission module 341 and a remote controller 342 connected with the bottom control panel 31, enabling wireless transmission and remote control of the target practice robot. The binocular recognition and positioning system 35 comprises a binocular high-speed camera 351 connected with the bottom control panel 31 and a marking assembly 352 worn on the trainee. The power system 36 comprises motor drivers 361 connected with the bottom control panel 31 and motors 362 connected with the motor drivers 361; the number of motor drivers 361 and motors 362 is set according to the needs of the target practice robot, for example a servo motor controlling the trolley chassis and a motor controlling the lifting and height adjustment of the target body. The battery management system 37 comprises a power supply 371 connected with the bottom control panel 31 and a coulometer 372 connected with the power supply 371, which detects the power supply's usage in real time. The navigation system 24 is connected to and communicates with the bottom control panel 31.
As can be seen from the above structure, the utility model's binocular-recognition-based target practice robot confrontation system consists of three architectural layers (Fig. 1): the execution layer, the system core layer and the business layer. The execution layer sits at the bottom of the architecture and mainly consists of six parts: the binocular recognition and positioning system, the power system, the battery management system, the laser simulation weapon system, the emergency braking system and the wireless communication system. This layer may of course include other systems, which are not the main improvement of the present case and are not specified here. The bottom control panel is the core component of the execution layer, undertaking tasks such as parsing and executing data from the upper (system core) layer, data computation and upload, issuing power system control commands, and acquiring motor encoder data. The system core layer mainly comprises the navigation system, the GNSS base station, the GNSS mobile station and the MCU, and mainly undertakes sensor data fusion, positioning calculation, path planning, motion control and data relay. The business layer is the control layer and mainly comprises terminal equipment such as the PC-side software system and mobile terminals; it is used to set training tasks, monitor robot state and bottom-level sensor data, and store and analyze training results.
The control flow of each subsystem in the binocular-recognition-based target practice robot confrontation system is described below.
The battery management system supplies power to the whole target practice robot, detects the battery use state in real time through the coulometer to obtain the battery state information, and sends the battery state information to the terminal equipment through the wireless module.
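The coulometer-based battery monitoring described above can be sketched as packing the readings into a telemetry message for the wireless module. The field names, capacity figures and low-battery threshold below are hypothetical; the patent specifies only that the battery state is measured and sent to the terminal equipment.

```python
def battery_status(charge_mah: float, capacity_mah: float,
                   voltage_v: float) -> dict:
    """Summarise coulometer readings into a battery state message."""
    pct = max(0.0, min(100.0, 100.0 * charge_mah / capacity_mah))
    return {"percent": round(pct, 1),
            "voltage_v": voltage_v,
            "low_battery": pct < 20.0}  # assumed warning threshold

msg = battery_status(charge_mah=4200, capacity_mah=6000, voltage_v=24.7)
print(msg)  # {'percent': 70.0, 'voltage_v': 24.7, 'low_battery': False}
```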
The binocular recognition and positioning system calculates position and distance information taking the marking assembly on the trainee as a base point (the current position information), predicts the position of the trainee's upcoming movement track (the predicted position information), and sends both to the laser simulation weapon system and the navigation system through the bottom control panel. The control flow of the binocular recognition and positioning system comprises the following steps:
before use, calibration is performed with the binocular high-speed camera: the trainee stands entirely within the shooting area and wears the marking assembly at a chosen characteristic position; the binocular high-speed camera captures images of the trainee as marking images; an image processing algorithm detects the marking assembly in the marking images to obtain a detection result; the spatial coordinates of the marking assembly in the three-dimensional world coordinate system are then calculated, completing calibration;
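Recovering the marking assembly's spatial coordinates from a calibrated stereo pair is classically done by disparity triangulation. The patent does not give its actual algorithm; the sketch below assumes an idealized rectified camera pair with pixel coordinates measured from the principal point, and all parameter values are illustrative.

```python
def triangulate(x_l: float, x_r: float, y: float,
                focal_px: float, baseline_m: float):
    """Recover camera-frame 3-D coordinates of a marker seen at pixel
    column x_l in the left image and x_r in the right image (row y),
    for a rectified stereo rig with the given focal length and baseline."""
    disparity = x_l - x_r
    if disparity <= 0:
        raise ValueError("marker must have positive disparity")
    Z = focal_px * baseline_m / disparity  # depth from the camera
    X = Z * x_l / focal_px                 # lateral offset
    Y = Z * y / focal_px                   # vertical offset
    return X, Y, Z

# 500 px focal length, 0.10 m baseline, 5 px disparity -> 10 m depth:
print(triangulate(x_l=25.0, x_r=20.0, y=0.0, focal_px=500.0, baseline_m=0.10))
```

In practice the intrinsics and rectification would come from a stereo calibration step (e.g. a checkerboard procedure), and the camera-frame point would then be transformed into the world coordinate system.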
during training, the marking assembly is worn at the trainee's characteristic position and the system operates as follows:
first, stereo video is acquired: while the target practice robot moves, the binocular high-speed camera captures and records video images as the current video image;
then, corresponding matching points between frames are obtained: feature point matching is applied to consecutive frames of the current video image;
then, the camera displacement is calculated from the coordinate changes of the matching points in image space, or by establishing three-dimensional coordinates, to obtain a camera displacement value;
then, binocular vision positioning is performed to obtain the position and rotation angle of the binocular high-speed camera at each moment of the movement; combined with Kalman filtering, the camera's route over the whole process is obtained, giving real-time binocular visual positioning of the target practice robot;
finally, the data obtained from the current video image is processed to derive the information needed to compute the trainee's current and predicted position information, which is sent to the central processing module for calculation.
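The steps above combine per-frame visual position estimates with Kalman filtering to recover a smooth route. A one-dimensional scalar Kalman filter illustrates the smoothing idea; the real system would filter a full 3-D pose, and the process/measurement variances below are assumed values, not figures from the patent.

```python
def kalman_1d(measurements, process_var=1e-3, meas_var=0.25):
    """Scalar Kalman filter: smooth a noisy 1-D track of positions.
    Returns the filtered estimate after each measurement."""
    est, var = measurements[0], 1.0  # initial state and uncertainty
    track = [est]
    for z in measurements[1:]:
        var += process_var              # predict: uncertainty grows
        k = var / (var + meas_var)      # Kalman gain
        est += k * (z - est)            # update toward the measurement
        var *= (1 - k)                  # uncertainty shrinks after update
        track.append(est)
    return track

noisy = [0.0, 0.9, 2.2, 2.8, 4.1]
print([round(p, 2) for p in kalman_1d(noisy)])
```

The filtered track lags the raw measurements but suppresses frame-to-frame noise, which is the property the route reconstruction relies on.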
The laser simulation weapon system starts the infrared transmitter to engage according to the current position information and transmits an encrypted infrared signal beam toward the trainee's current position. During the engagement, the infrared wireless ring-reporting target and the infrared receiver on the trainee detect in real time whether an infrared signal has been received in the corresponding receiving area; if so, the detected infrared signal is analyzed and decoded, the actual coordinates of the infrared signal's point of impact are calculated, and the coordinates are sent to the display module and the voice broadcast module for display and broadcast.
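The patent states that the infrared beam carries an encryption function and that received hits are decoded into impact coordinates, but gives no frame format. The XOR keying, checksum and frame layout below are purely illustrative of such an encode/decode round trip.

```python
KEY = 0x5A  # assumed shared key; the real scheme is unspecified

def encode_shot(shooter_id: int, x: int, y: int) -> bytes:
    """Pack shooter id and hit coordinates, add a checksum, XOR-obscure."""
    payload = bytes([shooter_id, x, y])
    checksum = sum(payload) & 0xFF
    return bytes(b ^ KEY for b in payload + bytes([checksum]))

def decode_shot(frame: bytes):
    """Undo the XOR, verify the checksum, and recover the hit record."""
    raw = bytes(b ^ KEY for b in frame)
    payload, checksum = raw[:-1], raw[-1]
    if sum(payload) & 0xFF != checksum:
        return None  # corrupted signal: ignore the hit
    shooter_id, x, y = payload
    return {"shooter": shooter_id, "hit": (x, y)}

frame = encode_shot(shooter_id=7, x=120, y=45)
print(decode_shot(frame))  # {'shooter': 7, 'hit': (120, 45)}
```

A checksum lets the receiving ring target reject partially received beams instead of reporting a false hit, which matches the patent's requirement to analyze and decode the signal before scoring.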
The navigation system compares the current and predicted position information with the position of the target practice robot and plans a navigation path that evades the trainee using an obstacle avoidance algorithm, obtaining waypoint and navigation path information, which it sends to the bottom control panel. The bottom control panel converts the waypoint and path information into driving information for the motors that move the target practice robot to the waypoint and sends it to the motor drivers, which control those motors. Meanwhile, the navigation system obtains position sensing information from the ultrasonic device and the radar of the emergency braking system, fuses it into the path planning calculation, and continuously performs automatic positioning and attitude adjustment of the target practice robot.
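The obstacle avoidance algorithm itself is not specified in the patent. As a minimal sketch of one evasive-waypoint rule, the fragment below pushes a candidate goal point out of an assumed circular keep-out zone around the trainee; the radius and function names are hypothetical.

```python
import math

SAFE_RADIUS_M = 2.0  # assumed keep-out distance around the trainee

def detour_waypoint(robot, goal, trainee, clearance=SAFE_RADIUS_M):
    """Return goal unchanged if it respects the keep-out circle around the
    trainee; otherwise push it radially out to the circle's boundary."""
    dx, dy = goal[0] - trainee[0], goal[1] - trainee[1]
    dist = math.hypot(dx, dy)
    if dist >= clearance:
        return goal
    if dist == 0:  # goal exactly on the trainee: retreat along the robot axis
        dx, dy = robot[0] - trainee[0], robot[1] - trainee[1]
        dist = math.hypot(dx, dy) or 1.0
    scale = clearance / dist
    return (trainee[0] + dx * scale, trainee[1] + dy * scale)

# Goal (4.5, 0) is only 0.5 m from the trainee at (5, 0): pushed to (3, 0).
print(detour_waypoint(robot=(0.0, 0.0), goal=(4.5, 0.0), trainee=(5.0, 0.0)))
```

A full planner would apply this kind of constraint along the whole path and fuse the ultrasonic/radar ranges mentioned above, but the single-waypoint rule shows the geometric idea.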
The emergency braking system performs emergency braking when the anti-collision strip is struck, or braking can be triggered manually through the emergency stop switch.
A target practice robot using the utility model's binocular-recognition-based confrontation system supports multiple use modes, so trainees of different levels can train in the corresponding mode. The target practice robot is more intelligent in use, training is more accurate and efficient, and training results improve quickly.
The above embodiments and drawings are not intended to limit the form and style of the present invention, and any suitable changes or modifications made by those skilled in the art should not be construed as departing from the scope of the present invention.
Claims (2)
1. A target-shooting robot confrontation system based on binocular recognition is characterized by comprising a business layer, a system core layer and an execution layer;
the business layer comprises a PC end software system; the PC end software system is communicated with the system core layer and the execution layer through a wireless module;
the system core layer comprises a GNSS base station, a GNSS mobile station, a micro control unit and a navigation system, wherein the GNSS mobile station and the micro control unit are connected and communicated with the navigation system, and the GNSS base station and the GNSS mobile station are connected and communicated through a wireless module;
the executive layer comprises a bottom layer control panel, a laser simulation weapon system, an emergency braking system, a wireless communication system, a binocular recognition and positioning system, a power system and a battery management system; the laser simulation weapon system comprises an infrared transmitter and an infrared wireless receiving ring target which are connected with the bottom layer control panel, and an infrared receiver which is worn on a trainee body; the emergency braking system comprises an ultrasonic device, a radar, an anti-collision strip and an emergency stop switch which are connected with the bottom layer control panel; the wireless communication system comprises a wireless transmission module and a remote controller which are connected with the bottom layer control panel; the binocular recognition and positioning system comprises a binocular high-speed camera which is connected with the bottom layer control panel and a marking assembly which is worn on the trainee body; the power system comprises a motor driver which is connected with the bottom layer control panel and a motor which is connected with the motor driver; and the battery management system comprises a power supply which is connected with the bottom layer control panel and a coulometer which is connected with the power supply; the navigation system is connected and communicated with the bottom control panel.
2. The binocular recognition-based target practice robot countermeasure system of claim 1, wherein the laser simulation weapon system further comprises a display module and a voice broadcast module connected with the bottom control board, the output terminal of the display module is an LED display screen, and the output terminal of the voice broadcast module is a loudspeaker.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202023224665.0U CN214148982U (en) | 2020-12-28 | 2020-12-28 | Target-shooting robot countermeasure system based on binocular recognition |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202023224665.0U CN214148982U (en) | 2020-12-28 | 2020-12-28 | Target-shooting robot countermeasure system based on binocular recognition |
Publications (1)
Publication Number | Publication Date |
---|---|
CN214148982U (en) | 2021-09-07
Family
ID=77543236
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202023224665.0U Active CN214148982U (en) | 2020-12-28 | 2020-12-28 | Target-shooting robot countermeasure system based on binocular recognition |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN214148982U (en) |
- 2020-12-28: application CN202023224665.0U filed; granted as patent CN214148982U (status: Active)
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN103353758B (en) | A kind of Indoor Robot navigation method | |
US20210311476A1 (en) | Patrol robot and patrol robot management system | |
CN105946853B (en) | The system and method for long range automatic parking based on Multi-sensor Fusion | |
EP2000777B1 (en) | Vehicle trajectory visualization system | |
CN109947119A (en) | A kind of autonomous system for tracking of mobile robot based on Multi-sensor Fusion and method | |
CN106527426A (en) | Indoor multi-target track planning system and method | |
KR101505411B1 (en) | Battle Game Relay System using Flying Robot | |
CN105182992A (en) | Unmanned aerial vehicle control method and device | |
US20190244536A1 (en) | Intelligent tactical engagement trainer | |
CN112461227B (en) | Wheel type chassis robot inspection intelligent autonomous navigation method | |
CN106291535A (en) | A kind of obstacle detector, robot and obstacle avoidance system | |
CN112665453A (en) | Target-shooting robot countermeasure system based on binocular recognition | |
CN106162144A (en) | A kind of visual pattern processing equipment, system and intelligent machine for overnight sight | |
CN104122891A (en) | Intelligent robot inspection system for city underground railway detection | |
CN112947550A (en) | Illegal aircraft striking method based on visual servo and robot | |
IL295508A (en) | Tactical advanced robotic engagement system | |
Schreiter et al. | The magni human motion dataset: Accurate, complex, multi-modal, natural, semantically-rich and contextualized | |
CN214148982U (en) | Target-shooting robot countermeasure system based on binocular recognition | |
CN112528699B (en) | Method and system for obtaining identification information of devices or users thereof in a scene | |
CN109901169A (en) | A kind of roadside parking space management system to be linked based on radar and rifle ball machine | |
CN110696003A (en) | Water side rescue robot based on SLAM technology and deep learning | |
CN113084776B (en) | Intelligent epidemic prevention robot and system based on vision and multi-sensor fusion | |
CN109542120A (en) | The method and device that target object is tracked by unmanned plane | |
CN112847374B (en) | Parabolic-object receiving robot system | |
US11577402B2 (en) | Robot system and portable teaching device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
GR01 | Patent grant | |