CN111076612A - Intelligent unmanned vehicle street-fighting weapon station - Google Patents

Intelligent unmanned vehicle street-fighting weapon station

Info

Publication number
CN111076612A
CN111076612A
Authority
CN
China
Prior art keywords
unit
control unit
control
module
signal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911352908.8A
Other languages
Chinese (zh)
Inventor
郭志华
蒋薇
罗静玲
李林森
沈虎
徐辉
胡园园
黄涛
朱华卫
孙菲
郑光迪
高楠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan Huazhong Tianqin Defense Technology Co ltd
Original Assignee
Wuhan Huazhong Tianqin Defense Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan Huazhong Tianqin Defense Technology Co ltd
Priority to CN201911352908.8A
Publication of CN111076612A

Classifications

    • F — MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41 — WEAPONS
    • F41H — ARMOUR; ARMOURED TURRETS; ARMOURED OR ARMED VEHICLES; MEANS OF ATTACK OR DEFENCE, e.g. CAMOUFLAGE, IN GENERAL
    • F41H11/00 — Defence installations; Defence devices

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Aiming, Guidance, Guns With A Light Source, Armor, Camouflage, And Targets (AREA)

Abstract

The invention relates to an intelligent unmanned vehicle street-fighting weapon station mounted on an unmanned vehicle, comprising a photoelectric observing and aiming unit, an image processing unit, a comprehensive control unit, a weapon unit and a display and control unit. The photoelectric observing and aiming unit acquires and outputs image information of a target and its background; the image processing unit detects and identifies the acquired image information and outputs and feeds back the detection and identification result; the comprehensive control unit receives the image information and control instructions, and outputs control signals according to the detection and identification result and the instructions; the weapon unit receives the control signals and strikes the target; the display and control unit displays the image information and accepts control instructions. The invention reduces the operator's workload and the missed-detection rate, and, because the operator controls the station through wireless communication, improves the user experience and reduces the danger to the operator.

Description

Intelligent unmanned vehicle street-fighting weapon station
Technical Field
The invention relates to the field of military equipment, and in particular to an intelligent unmanned vehicle street-fighting weapon station.
Background
In current urban street fighting, counter-terrorism and anti-riot personnel typically drive a vehicle into a designated area, manually control the photoelectric observing and aiming unit of a weapon station to search for targets, and, once a target is found, aim manually or automatically to strike the confirmed target.
The photoelectric observing and aiming unit has no automatic target-recognition capability, so targets must be searched for manually; because the urban background is complex, the search workload is heavy, and operator fatigue easily leads to missed targets. At the same time, modern urban combat is multi-dimensional, diverse and complex, so personnel engaged in counter-terrorism and anti-riot work face great danger.
Disclosure of Invention
In view of the above technical problems in the prior art, the invention provides an intelligent unmanned vehicle street-fighting weapon station that reduces the operator's workload, the missed-detection rate, and the danger to the operator.
The technical scheme for solving the technical problems is as follows:
an intelligent unmanned vehicle street-fighting weapon station, which is arranged on an unmanned vehicle and comprises a photoelectric observing and aiming unit, an image processing unit, a comprehensive control unit, a weapon unit and a display and control unit,
the photoelectric observing and aiming unit is in signal connection with the comprehensive control unit and is used for acquiring and outputting image information of a target and a background;
the image processing unit is in signal connection with the comprehensive control unit and the display control unit respectively and is used for outputting and feeding back a detection and identification result after detecting and identifying the acquired image information;
the comprehensive control unit is respectively connected with the photoelectric observing and aiming unit, the display control unit, the image processing unit and the weapon unit, and is used for receiving image information and a control instruction and outputting a control signal according to a detection and identification result of the image information and the control instruction;
the weapon unit is in signal connection with the comprehensive control unit and is used for receiving a control signal and striking a target;
the display control unit is in signal connection with the comprehensive control unit and the image processing unit respectively and is used for displaying image information and inputting a control instruction.
On the basis of the technical scheme, the invention can be further improved as follows.
Further, the photoelectric observing and aiming unit comprises an optical single machine and a photoelectric turntable; the optical single machine is fixedly mounted on the photoelectric turntable and rotates with it. The optical single machine comprises an image sensor, a laser range finder, a fiber-optic gyroscope and a sensor control module. The video output end of the image sensor is connected to the comprehensive control unit through an optical fiber; the control input ends of the image sensor and of the laser range finder are each in signal connection with the output end of the sensor control module; the input end of the sensor control module is in signal connection with the comprehensive control unit; and the signal output ends of the laser range finder and of the fiber-optic gyroscope are each in signal connection with the comprehensive control unit.
Further, the photoelectric turntable comprises a driver, a motor and an angle measuring module, wherein the driver and the motor are fixedly arranged on a fixed part of the photoelectric turntable, a rotor of the motor is fixedly connected with a rotating part of the photoelectric turntable, and the angle measuring module is arranged on the rotating part of the photoelectric turntable; the input end of the driver is in signal connection with the comprehensive control unit, the output end of the driver is connected with the control signal input end of the motor, and the output end of the angle measuring module is in signal connection with the comprehensive control unit.
Furthermore, the integrated control unit comprises an optical fiber receiving module, a video transmission module, an integrated control module and a shooting control module, wherein the input end of the optical fiber receiving module is connected with the photoelectric observing and aiming unit through an optical fiber, the output end of the optical fiber receiving module is in signal connection with the input end of the image processing unit and the input end of the video transmission module respectively, and the optical fiber receiving module is used for receiving optical signals of image information and outputting the optical signals as electric signals; the output end of the video transmission module is in signal connection with the display control unit, and the video transmission module is used for transmitting the electric signal of the image information to the display control unit; the comprehensive control module is in signal connection with the photoelectric observing and aiming unit, the shooting control module and the display control unit respectively, and the comprehensive control module is used for receiving a control instruction of the display control unit, receiving a signal of the photoelectric observing and aiming unit and outputting a control signal to the shooting control module.
Furthermore, the display control unit is in signal connection with the image processing unit, the video transmission module and the comprehensive control module respectively by wireless communication, and is used for displaying image information and providing a human-computer interaction page.
Furthermore, the control input end of the weapon unit is in signal connection with the control output end of the shooting control module and is used for receiving the control signal of the shooting control module and striking a target.
Furthermore, the output end of the image processing unit is in signal connection with the display control unit by wireless communication, and the image processing unit is used for calculating a target detection and identification result from the received electrical image signal through an artificial intelligence algorithm and outputting it to the display control unit in real time; the output end of the image processing unit is also in signal connection with the video transmission module, and the image processing unit outputs the image on which the target detection and identification information is superimposed to the video transmission module.
Further, the signal output end of the image processing unit is also in signal connection with the comprehensive control module, and the image processing unit calculates the deviation amount of the target in the image through an image tracking algorithm and outputs the deviation amount to the comprehensive control module.
Further, the image processing unit calculates the target detection and recognition result through the YOLO_V3 algorithm.
Further, the image processing unit calculates the deviation amount of the target in the image through the OpenPose algorithm.
The invention has the following beneficial effects: the device acquires images of a target area from multiple angles through the photoelectric observing and aiming unit, and an artificial intelligence algorithm in the image processing unit is trained on the features of targets of specified categories, giving the station intelligent image detection and recognition. Once this capability is loaded into the image processing unit, the weapon station recognises targets automatically and reports the detection and recognition results to the operator at the display and control unit in real time; the operator controls the device by wireless communication and strikes the recognised target with the weapon unit. This reduces the operator's workload and the missed-detection rate, and, because control is exercised over wireless communication, improves the user experience and reduces the danger to the operator.
Drawings
FIG. 1 is a block diagram of the overall structure of the present invention;
FIG. 2 is a detailed structural block diagram of the present invention.
In the drawings, the components represented by the respective reference numerals are listed below:
1. photoelectric observing and aiming unit; 11. optical single machine; 111. image sensor; 112. laser range finder; 113. fiber-optic gyroscope; 114. sensor control module; 115. fiber-optic transmitting module; 12. photoelectric turntable; 121. driver; 122. motor; 123. angle measuring module; 2. image processing unit; 3. integrated control unit; 31. integrated control module; 32. optical fiber receiving module; 33. video transmission module; 34. shooting control module; 4. weapon unit; 5. display and control unit.
Detailed Description
The principles and features of this invention are described below in conjunction with the following drawings, which are set forth by way of illustration only and are not intended to limit the scope of the invention.
An intelligent unmanned vehicle street-fighting weapon station as shown in FIGS. 1-2 is arranged on an unmanned vehicle, and comprises a photoelectric observing and aiming unit 1, an image processing unit 2, a comprehensive control unit 3, a weapon unit 4 and a display control unit 5;
the photoelectric observing and aiming unit 1 is in signal connection with the comprehensive control unit 3 and is used for acquiring and outputting image information of a target and a background;
the image processing unit 2 is in signal connection with the integrated control unit 3 and the display control unit 5 respectively, and is used for outputting and feeding back a detection and identification result after detecting and identifying the acquired image information;
the comprehensive control unit 3 is connected with the photoelectric observing and aiming unit 1, the display control unit 5, the image processing unit 2 and the weapon unit 4 respectively, and is used for receiving image information and a control instruction and outputting a control signal according to a detection and identification result of the image information and the control instruction;
the weapon unit 4 is in signal connection with the comprehensive control unit 3 and is used for receiving a control signal and striking a target;
the display control unit 5 is in signal connection with the integrated control unit 3 and the image processing unit 2 respectively, and is used for displaying image information and inputting a control instruction.
On the basis of the technical scheme, the invention can be further improved as follows.
In this embodiment, the photoelectric observing and aiming unit 1 comprises an optical single machine 11 and a photoelectric turntable 12; the optical single machine 11 is fixedly mounted on the photoelectric turntable 12 and rotates with it, which enlarges the image-acquisition range. The optical single machine 11 comprises an image sensor 111, a laser range finder 112, a fiber-optic gyroscope 113 and a sensor control module 114. A fiber-optic transmitting module 115 is arranged at the video output end of the image sensor 111, and the output end of the fiber-optic transmitting module 115 is connected to the integrated control unit 3, so that the electrical signal carrying the image information is converted into an optical signal and transmitted to the integrated control unit 3 for further processing. The control input ends of the image sensor 111 and of the laser range finder 112 are each in signal connection with the output end of the sensor control module 114, and the input end of the sensor control module 114 is in signal connection with the integrated control unit 3 through an RS422 serial port. The integrated control unit 3 outputs control signals to the sensor control module 114, which in turn commands the image sensor 111 to acquire image information and the laser range finder 112 to measure the distance to the target in the image. The signal output ends of the laser range finder 112 and of the fiber-optic gyroscope 113 are each in signal connection with the integrated control unit 3, so that the target-distance information measured by the laser range finder 112 and the angular-velocity and rotation-angle signals measured by the fiber-optic gyroscope 113 are transmitted to the integrated control unit 3 for further calculation.
In this embodiment, the photoelectric turntable 12 comprises a driver 121, a motor 122 and an angle measuring module 123. The driver 121 and the motor 122 are fixedly mounted on the fixed portion of the photoelectric turntable 12 (i.e. on the unmanned vehicle carrying the weapon device of this embodiment), the rotor of the motor 122 is fixedly connected with the rotating portion of the photoelectric turntable 12, and the angle measuring module 123 is arranged on the rotating portion and monitors the rotation angle of the photoelectric turntable 12 in real time. The input end of the driver 121 is in signal connection with the integrated control unit 3, and the output end of the driver 121 is connected to the control signal input end of the motor 122; the driver 121 converts the motor control signal output by the integrated control unit 3 into a driving signal that makes the motor 122 rotate the rotating portion of the photoelectric turntable 12. The output end of the angle measuring module 123 is in signal connection with the integrated control unit 3 and feeds back the measured rotation angle in real time, so that the integrated control unit 3 can judge whether the turntable has reached the target angle. The motor 122 is a servo motor, which allows precise control of the rotation angle. Together, the driver 121, the motor 122 and the angle measuring module 123 form a closed-loop servo control system that accurately adjusts the rotation angle of the photoelectric turntable.
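The patent describes only the closed-loop structure, not the control law. Purely as an illustration, one iteration of a textbook PID position loop for the turntable azimuth could be sketched as follows; the function name, gain values and time step are assumptions, not values from the patent.

```python
def pid_step(setpoint_deg, measured_deg, state, kp=2.0, ki=0.1, kd=0.5, dt=0.01):
    """One iteration of a hypothetical position PID loop for the turntable.

    setpoint_deg -- target azimuth commanded by the integrated control unit
    measured_deg -- angle fed back by the angle-measuring module
    state        -- (integral, previous_error) carried between iterations
    Returns the drive command for the motor driver and the updated state.
    The gains are illustrative placeholders, not taken from the patent.
    """
    integral, prev_err = state
    err = setpoint_deg - measured_deg
    integral += err * dt                    # accumulate error (I term)
    derivative = (err - prev_err) / dt      # rate of change (D term)
    command = kp * err + ki * integral + kd * derivative
    return command, (integral, err)
```

In the station, the resulting command would correspond to the control signal that the driver 121 converts into a driving signal for the servo motor 122, with the angle measuring module 123 closing the loop.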
In this embodiment, the integrated control unit 3 comprises an optical fiber receiving module 32, a video transmission module 33, an integrated control module 31 and a shooting control module 34. The input end of the optical fiber receiving module 32 is connected through an optical fiber to the fiber-optic transmitting module 115 of the photoelectric observing and aiming unit 1; the output end of the optical fiber receiving module 32 is in signal connection with the input ends of the image processing unit 2 and of the video transmission module 33 respectively; the optical fiber receiving module 32 receives the optical signal carrying the image information and outputs it as an electrical signal. The output end of the video transmission module 33 is in signal connection with the display and control unit 5 through a radio station, and the video transmission module 33 transmits the electrical image signal to the display and control unit 5. The integrated control module 31 is in signal connection with the photoelectric observing and aiming unit 1 and the shooting control module 34; specifically, it is in signal connection with the sensor control module 114, the fiber-optic gyroscope 113 and the shooting control module 34, and is connected to the display and control unit 5 by wireless radio. The integrated control module 31 receives the control instructions entered by the operator at the display and control unit 5, receives signals from the photoelectric observing and aiming unit 1, and outputs control signals to the shooting control module 34; more specifically, it receives the signal of the fiber-optic gyroscope 113 and, according to the control instructions, outputs control signals to the sensor control module 114, to command the sensors of the photoelectric observing and aiming unit 1 to acquire information, and to the shooting control module 34, to strike the detected and identified target with the weapon unit 4.
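The routing role of the integrated control module 31 can be pictured with a small dispatch table. This is a hypothetical sketch only: the command strings and subsystem names below are illustrative and do not appear in the patent.

```python
# Hypothetical mapping from operator commands (sent by the display and
# control unit 5 over the radio link) to the subsystem that must act on
# them, following the data flow described in the text above.
COMMAND_ROUTES = {
    "zoom":  "sensor_control_module_114",   # focal-length control of sensor 111
    "range": "laser_range_finder_112",      # start a laser-ranging cycle
    "track": "image_processing_unit_2",     # begin image tracking
    "fire":  "shooting_control_module_34",  # trigger the weapon unit 4
}

def route_command(command: str) -> str:
    """Return the subsystem responsible for a given operator command,
    or a rejection marker for commands the module does not recognise."""
    return COMMAND_ROUTES.get(command, "reject_unknown_command")
```

For example, `route_command("fire")` would hand the instruction to the shooting control module, matching the strike sequence described later in the working principle.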
In this embodiment, the display and control unit 5 is in wireless communication connection with the image processing unit 2, the video transmission module 33, and the integrated control module 31 through a wireless radio station, and is configured to display image information and provide a human-computer interaction page, and an operator inputs a control instruction and sets parameters of the device on the human-computer interaction page provided by the display and control unit 5.
In this embodiment, the control input end of the weapon unit 4 is in signal connection with the control output end of the shooting control module 34 through an I/O port, and is configured to receive the control signal of the shooting control module 34 and strike a target.
In this embodiment, the output end of the image processing unit 2 is in signal connection with the display and control unit 5 in a wireless communication manner, and the image processing unit 2 is configured to calculate a target detection and identification result from the received image information electrical signal through an artificial intelligence algorithm and output the target detection and identification result to the display and control unit 5 in real time; the output end of the image processing unit 2 is further connected with the video transmission module 33 through a video line, and the image processing unit 2 outputs the image on which the target detection identification information is superimposed to the video transmission module 33.
In this embodiment, the image processing unit 2 applies an artificial intelligence algorithm to detect and identify targets such as people and vehicles. Specifically, the signal output end of the image processing unit 2 is further connected to the integrated control module 31 through a communication network, and the image processing unit 2 calculates the deviation amount of the target in the image through an image tracking algorithm and outputs it to the integrated control module 31.
In this embodiment, the image processing unit 2 calculates the target detection and recognition result by using the YOLO_V3 algorithm.
In this embodiment, the image processing unit 2 calculates the deviation amount of the target in the image through the OpenPose algorithm.
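Any YOLO_V3 post-processing stage filters raw candidate boxes by class and confidence before reporting detections. As a minimal sketch of that step, with the tuple layout, class names and threshold being assumptions rather than details from the patent:

```python
def filter_detections(detections, conf_threshold=0.5, classes=("person", "vehicle")):
    """Keep detections that are confident enough and belong to the target
    classes (people and vehicles, per the description above).

    detections -- iterable of (label, confidence, (x, y, w, h)) tuples, the
                  shape a typical YOLO_V3 post-processing stage emits.
    """
    return [
        (label, conf, bbox)
        for label, conf, bbox in detections
        if conf >= conf_threshold and label in classes
    ]
```

The surviving detections are what the image processing unit would superimpose on the video and report to the display and control unit.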
The working principle is as follows:
the operator selects a manual remote control or autonomous sector scanning command to output to the comprehensive control module 31 at the display control unit 5, and after the comprehensive control module 31 receives the command, the servo control system (i.e. the photoelectric rotary table 12 of the photoelectric observing and aiming unit 1) is used for realizing the motion control of the weapon platform and adjusting the rotation angle of the weapon station so as to expand the observing and aiming range.
The image sensor 111 for optical imaging converts the target and background reflected light signals or thermal radiation signals into electrical signals and inputs the electrical signals into the optical fiber transmitting module 115, the optical fiber transmitting module 115 converts the electrical signals into optical signals and outputs the optical signals to the optical fiber receiving module 32 through optical fibers, and the optical fiber receiving module 32 converts the optical signals into SDI video signals and outputs the SDI video signals to the video transmission module 33 and the image processing unit 2, respectively.
After receiving the SDI video signal, the image processing unit 2 uses an artificial intelligence algorithm to output the detection and identification results for targets such as people and vehicles to the display control unit 5 in real time; the image processing unit 2 outputs the image on which the target detection and identification information is superimposed to the video transmission module 33, at which point the device enters tracking mode. In tracking mode, the image processing unit 2 additionally outputs the deviation amount of the target in the image to the integrated control module 31 using an image tracking algorithm; the operator can then enter a behaviour recognition instruction at the display control unit 5, and after receiving it, the image processing unit 2 also periodically reports the target's dangerous behaviour to the display control unit 5.
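The "deviation amount of the target in the image" that drives the tracking loop is the offset of the tracked box from the image centre. A minimal sketch follows, assuming a 1920×1080 video frame (the patent does not specify the sensor resolution):

```python
def target_deviation(bbox, frame_w=1920, frame_h=1080):
    """Pixel offset (dx, dy) of a target bounding-box centre from the image
    centre -- the quantity the image processing unit 2 would report to the
    integrated control module 31 to steer the turntable. The frame size is
    an assumed 1080p format, not stated in the patent.

    bbox -- (x, y, w, h) with (x, y) the top-left corner, in pixels.
    """
    x, y, w, h = bbox
    cx = x + w / 2.0                 # box centre, horizontal
    cy = y + h / 2.0                 # box centre, vertical
    return cx - frame_w / 2.0, cy - frame_h / 2.0
```

A deviation of (0, 0) means the target is centred; the position closed loop would drive both components toward zero.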
After receiving the SDI video signal input by the optical fiber receiving module 32 and the network video signal input by the image processing unit 2, the video transmission module 33 starts to store the SDI video image on the one hand, and performs coding and compression on the network video signal on the other hand, and pushes the compressed video stream to the display control unit 5 through a radio station.
In the process of target detection and identification, the display control unit 5 can send a zoom instruction to the integrated control module 31 through the radio station, and after the integrated control module 31 receives the control instruction of the display control unit 5, the image sensor 111 is controlled by the sensor control module 114 to perform zoom control, so that the image sensor 111 outputs a clear image.
When the display control unit 5 confirms a dangerous target, it can output a target-tracking instruction to the integrated control module 31 and the image processing unit 2. After receiving the tracking instruction, the image processing unit 2 extracts the target deviation amount with a target tracking algorithm and sends it to the integrated control module 31, which then tracks the target using a closed position loop.
When the display control unit 5 needs to range or locate a target, it sends a laser-ranging instruction to the integrated control module 31. The integrated control module 31 forwards a start-ranging command to the laser range finder 112 through the sensor control module 114; the laser range finder 112 then performs the measurement, and the measured range is fed back to the display control unit 5 through the sensor control module 114, the integrated control module 31 and the radio station.
When the display control unit 5 needs to hit the target, the display control unit 5 sends a hitting instruction to the integrated control module 31 through the radio station, and after the integrated control module 31 receives the instruction, the shooting control module 34 controls the weapon of the weapon unit 4 to fire.
Considering both real-time performance and the detection and recognition rate, the artificial intelligence detection and recognition algorithm adopts the prior-art YOLO_V3 algorithm. The specific steps are as follows:
when the acquired image is a visible light image, extracting a target area by using a YOLO _ V3 algorithm in combination with a detection result output by a target detection and recognition function of the image processing unit 2, performing attitude estimation on N target areas by using an openposition algorithm model, classifying attitude information output by the N areas by using an SVM classifier, and uploading a behavior pattern of a target; when the acquired image is an infrared image, the target area is extracted by using a YOLO _ V3 algorithm in combination with the detection result output by the target detection and identification function of the image processing unit 2, then the high-level features of the multi-frame image are associated, the associated features are classified, the behavior category of the target is determined, and the identification result is uploaded.
The device of the invention acquires images of a target area from multiple angles through the photoelectric observing and aiming unit 1, and an artificial intelligence algorithm in the image processing unit 2 is trained on the features of targets of specified categories, giving the station intelligent image detection and recognition. After this capability is loaded into the image processing unit 2, the weapon station recognises targets automatically and reports the detection and recognition results to the operator at the display and control unit 5 in real time; the operator controls the device by wireless communication and strikes the recognised target with the weapon unit 4. This reduces the operator's workload and the missed-detection rate, and, because control is exercised over wireless communication, improves the user experience and reduces the danger to the operator.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (10)

1. An intelligent unmanned vehicle street-fighting weapon station, arranged on an unmanned vehicle, characterized by comprising a photoelectric observing and aiming unit (1), an image processing unit (2), a comprehensive control unit (3), a weapon unit (4) and a display control unit (5),
the photoelectric observing and aiming unit (1) is in signal connection with the comprehensive control unit (3) and is used for acquiring and outputting image information of a target and a background;
the image processing unit (2) is in signal connection with the comprehensive control unit (3) and the display control unit (5) respectively, and is used for outputting and feeding back a detection and identification result after detecting and identifying the acquired image information;
the comprehensive control unit (3) is respectively connected with the photoelectric observing and aiming unit (1), the display and control unit (5), the image processing unit (2) and the weapon unit (4) and is used for receiving image information and control instructions and outputting control signals according to the detection and identification results of the image information and the control instructions;
the weapon unit (4) is in signal connection with the comprehensive control unit (3) and is used for receiving a control signal and striking a target;
the display control unit (5) is in signal connection with the integrated control unit (3) and the image processing unit (2) respectively and is used for displaying image information and inputting control instructions.
2. The intelligent unmanned vehicle street-fighting weapon station according to claim 1, wherein the photoelectric observing and aiming unit (1) comprises an optical single machine (11) and a photoelectric turntable (12); the optical single machine (11) is fixedly mounted on the photoelectric turntable (12) and can rotate with it; the optical single machine (11) comprises an image sensor (111), a laser range finder (112), a fiber-optic gyroscope (113) and a sensor control module (114); the video output end of the image sensor (111) is connected to the integrated control unit (3) through an optical fiber; the control input ends of the image sensor (111) and of the laser range finder (112) are each in signal connection with the output end of the sensor control module (114); the input end of the sensor control module (114) is in signal connection with the integrated control unit (3); and the signal output ends of the laser range finder (112) and of the fiber-optic gyroscope (113) are each in signal connection with the integrated control unit (3).
3. The intelligent unmanned vehicle lane warfare weapon station according to claim 2, wherein the photoelectric turntable (12) comprises a driver (121), a motor (122) and an angle measurement module (123); the driver (121) and the motor (122) are fixedly mounted on the fixed portion of the photoelectric turntable (12), the rotor of the motor (122) is fixedly connected with the rotating portion of the photoelectric turntable (12), and the angle measurement module (123) is arranged on the rotating portion of the photoelectric turntable (12); the input of the driver (121) is in signal connection with the integrated control unit (3), the output of the driver (121) is connected with the control signal input of the motor (122), and the output of the angle measurement module (123) is in signal connection with the integrated control unit (3).
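The driver, motor and angle measurement module of claim 3 together form a closed position loop: the integrated control unit commands an angle, the angle measurement module feeds back the measured turntable angle, and the driver receives a correcting rate command. The patent does not disclose a control law; the proportional controller, gain and limit below are purely illustrative assumptions.

```python
def azimuth_step(target_deg, measured_deg, kp=2.0, max_rate=60.0):
    """One update of a proportional azimuth loop (illustrative sketch).

    target_deg:   commanded turntable angle from the integrated control unit (3)
    measured_deg: feedback from the angle measurement module (123)
    Returns a rate command (deg/s) for the driver (121), clamped to max_rate.
    """
    error = target_deg - measured_deg
    # Wrap the error into [-180, 180) so the turntable turns the short way round.
    error = (error + 180.0) % 360.0 - 180.0
    rate = kp * error
    return max(-max_rate, min(max_rate, rate))
```

In a real system this update would run at the servo rate, with the rate command sent over the signal connection from the integrated control unit (3) to the driver (121).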
4. The intelligent unmanned vehicle lane warfare weapon station according to claim 1, wherein the integrated control unit (3) comprises an optical fiber receiving module (32), a video transmission module (33), an integrated control module (31) and a firing control module (34); the input of the optical fiber receiving module (32) is connected to the photoelectric sighting unit (1) through an optical fiber, the output of the optical fiber receiving module (32) is in signal connection with the input of the image processing unit (2) and the input of the video transmission module (33) respectively, and the optical fiber receiving module (32) is configured to receive the optical signal carrying the image information and output it as an electrical signal; the output of the video transmission module (33) is in signal connection with the display and control unit (5), and the video transmission module (33) is configured to transmit the electrical signal of the image information to the display and control unit (5); the integrated control module (31) is in signal connection with the photoelectric sighting unit (1), the firing control module (34) and the display and control unit (5), and is configured to receive control instructions from the display and control unit (5), receive signals from the photoelectric sighting unit (1), and output control signals to the firing control module (34).
5. The intelligent unmanned vehicle lane warfare weapon station according to claim 4, wherein the display and control unit (5) is in signal connection with the image processing unit (2), the video transmission module (33) and the integrated control module (31) by wireless communication, and is configured to display image information and provide a human-machine interaction interface.
6. The intelligent unmanned vehicle lane warfare weapon station according to claim 4, wherein the control input of the weapon unit (4) is in signal connection with the control output of the firing control module (34), for receiving the control signal of the firing control module (34) and striking a target.
7. The intelligent unmanned vehicle lane warfare weapon station according to claim 4, wherein the output of the image processing unit (2) is in signal connection with the display and control unit (5) by wireless communication, and the image processing unit (2) is configured to compute a target detection and identification result from the received electrical signal of the image information by means of an artificial intelligence algorithm and to output that result to the display and control unit (5) in real time; the output of the image processing unit (2) is further in signal connection with the video transmission module (33), and the image processing unit (2) outputs the image with the target detection and identification information superimposed on it to the video transmission module (33).
8. The intelligent unmanned vehicle lane warfare weapon station according to claim 7, wherein the signal output of the image processing unit (2) is further in signal connection with the integrated control module (31), and the image processing unit (2) computes the deviation of the target within the image by means of an image tracking algorithm and outputs that deviation to the integrated control module (31).
9. The intelligent unmanned vehicle lane warfare weapon station according to claim 7, wherein the image processing unit (2) computes the target detection and identification result by means of the YOLO_V3 algorithm.
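Claim 9 names YOLO_V3 but gives no implementation detail. As an illustrative sketch of the post-processing stage any such detector needs (confidence thresholding followed by greedy non-maximum suppression), with box format and thresholds chosen as assumptions, not taken from the patent:

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / float(area_a + area_b - inter or 1)

def filter_detections(dets, score_thr=0.5, iou_thr=0.45):
    """YOLO-style post-processing: drop low-score boxes, then greedy NMS.

    dets: list of (box, score) with box = (x1, y1, x2, y2).
    Returns the surviving (box, score) pairs, highest score first.
    """
    dets = sorted((d for d in dets if d[1] >= score_thr),
                  key=lambda d: d[1], reverse=True)
    keep = []
    for box, score in dets:
        # Keep a box only if it does not heavily overlap an already-kept one.
        if all(iou(box, k[0]) < iou_thr for k in keep):
            keep.append((box, score))
    return keep
```

The surviving boxes would be what the image processing unit (2) superimposes on the video and reports to the display and control unit (5).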
10. The intelligent unmanned vehicle lane warfare weapon station according to claim 7, wherein the image processing unit (2) computes the deviation of the target within the image by means of an OpenCV algorithm.
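In a typical remote-weapon-station design, the "deviation" of claims 8 and 10 is the offset of the tracked target from the boresight (image center), which the integrated control module turns into turntable corrections. A minimal sketch under that assumption — the bounding-box format matches what OpenCV trackers such as cv2.TrackerKCF return, but the tracker choice and frame size are illustrative, not stated in the patent:

```python
def target_deviation(bbox, frame_w=1920, frame_h=1080):
    """Pixel deviation of a tracked target from the image center.

    bbox: (x, y, w, h) bounding box, e.g. as returned by an OpenCV
          tracker's update() call (tracker choice is an assumption;
          the patent only says "image tracking algorithm").
    Returns (dx, dy): positive dx means the target is right of the
    boresight, positive dy means below it — the correction the
    integrated control module (31) would feed to the turntable.
    """
    x, y, w, h = bbox
    cx, cy = x + w / 2.0, y + h / 2.0
    return cx - frame_w / 2.0, cy - frame_h / 2.0
```

Converting this pixel offset into an angular correction would additionally require the camera's field of view, which the patent does not specify.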
CN201911352908.8A 2019-12-25 2019-12-25 Intelligent unmanned vehicle lane warfare weapon station Pending CN111076612A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911352908.8A CN111076612A (en) 2019-12-25 2019-12-25 Intelligent unmanned vehicle lane warfare weapon station

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911352908.8A CN111076612A (en) 2019-12-25 2019-12-25 Intelligent unmanned vehicle lane warfare weapon station

Publications (1)

Publication Number Publication Date
CN111076612A true CN111076612A (en) 2020-04-28

Family

ID=70317481

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911352908.8A Pending CN111076612A (en) 2019-12-25 2019-12-25 Intelligent unmanned vehicle lane warfare weapon station

Country Status (1)

Country Link
CN (1) CN111076612A (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103940297A (en) * 2013-01-21 2014-07-23 上海市浦东新区知识产权保护协会 Unmanned reconnaissance weapon platform
CN204535572U (en) * 2015-04-13 2015-08-05 中国船舶重工集团公司第七一七研究所 Remote weapon station (RWS)
CN107757919A (en) * 2017-10-26 2018-03-06 牟正芳 Armed drones' optronic fire control system and method
CN109099779A (en) * 2018-08-31 2018-12-28 江苏域盾成鹫科技装备制造有限公司 A kind of detecting of unmanned plane and intelligent intercept system
US20190033425A1 (en) * 2017-07-31 2019-01-31 United States Of America, As Represented By The Secretary Of The Army Weapon Fire Detection and Localization System for Electro-Optical Sensors


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113904727A (en) * 2021-09-22 2022-01-07 北京电子工程总体研究所 Remote control system and method for display and control console of command and control vehicle
CN114294997A (en) * 2022-01-05 2022-04-08 武汉华中天勤防务技术有限公司 Weapon station base and weapon station carrying equipment
CN114294997B (en) * 2022-01-05 2023-10-24 武汉华中天勤防务技术有限公司 Weapon station base and weapon station carrying equipment

Similar Documents

Publication Publication Date Title
CN108447075B (en) Unmanned aerial vehicle monitoring system and monitoring method thereof
CN105486175B (en) Low-altitude security protection system and method based on large-power continuous laser
CN101922894B (en) Anti-sniper laser active detection system and method
CN108258613B (en) Intelligent line inspection photoelectric pod and line inspection realization method
CN114115296B (en) Intelligent inspection and early warning system and method for key area
CN109737981B (en) Unmanned vehicle target searching device and method based on multiple sensors
CN104524731A (en) Multi-information fusion intelligent water monitor extinguishing system based on electric-optic turret
CN113877124B (en) Intelligent control system for jet flow falling point of fire monitor
CN111076612A (en) Intelligent unmanned vehicle lane warfare weapon station
CN108534603B (en) Laser ranging night vision sighting telescope, anti-unmanned aerial vehicle net capturing gun and use method thereof
CN104132586A (en) Automatic aiming system of firearms and operation method of automatic aiming system
CN116400738B (en) Low-cost striking method and system for low-speed unmanned aerial vehicle
CN102116596A (en) Method for judging targeting of simulated shooting for tank element training based on image analysis
KR20160123551A (en) System and method for controlling video information based automatic of the drone for the inspection of electric power facilities
CN108170139A (en) A kind of photoelectricity multitask system for unmanned boat and perform method
CN103278142B (en) Optoelectronic system-based continuous-tracking automatic-switching method
CN104049267A (en) Forest fire point positioning method based on GPS and microwave distance measurement
CN110434874A (en) A kind of explosion machine intelligent control system
CN106873626B (en) Passive positioning and searching system
CN206953020U (en) A kind of portable Cha Da robots
CN113848992A (en) Target detection location and automatic shooting system based on unmanned aerial vehicle and armed beating robot
CN105578410A (en) Novel unmanned plane
CN112665453A (en) Target-shooting robot countermeasure system based on binocular recognition
CN104111663A (en) Three-dimensional closed-loop feedback control method for automatic rocket inertia unit target prism collimation
CN109040557B (en) Law enforcement recorder for patrol and control method and working method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200428