CN111824406A - Public safety autonomous inspection quad-rotor unmanned aerial vehicle based on machine vision - Google Patents
Public safety autonomous inspection quad-rotor unmanned aerial vehicle based on machine vision
- Publication number: CN111824406A (application CN202010691797.X)
- Authority: CN (China)
- Prior art keywords: image, target, data, unmanned aerial vehicle
- Legal status: Pending (the legal status is an assumption, not a legal conclusion)
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64C—AEROPLANES; HELICOPTERS
- B64C27/00—Rotorcraft; Rotors peculiar thereto
- B64C27/04—Helicopters
- B64C27/08—Helicopters with two or more rotors
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U10/00—Type of UAV
- B64U10/10—Rotorcrafts
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J5/00—Radiation pyrometry, e.g. infrared or optical thermometry
- G01J5/0022—Radiation pyrometry, e.g. infrared or optical thermometry for sensing the radiation of moving bodies
- G01J5/0025—Living bodies
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/12—Target-seeking control
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/217—Validation; Performance evaluation; Active pattern learning techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2101/00—UAVs specially adapted for particular uses or applications
- B64U2101/30—UAVs specially adapted for particular uses or applications for imaging, photography or videography
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2201/00—UAVs characterised by their flight controls
- B64U2201/20—Remote controls
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J5/00—Radiation pyrometry, e.g. infrared or optical thermometry
- G01J2005/0077—Imaging
Abstract
The invention discloses a public safety autonomous inspection quad-rotor unmanned aerial vehicle based on machine vision, belonging to the field of aircraft. The vehicle consists of a power system, a flight control system, a data processing system and a mobile terminal system. In operation, a target model uploaded from the mobile terminal system to the data processing system analyzes the real-time image data obtained by the image acquisition module to determine the type of target in the image. When a human is recognized, the crowd density is calculated from the image data to judge whether the scene is a suspicious gathering of people; nearby persons are then scanned for face data one by one, body temperature is obtained with a thermal infrared imager to judge whether it is normal, and the facial features are compared with the pictures in a criminal face database to judge whether the person is a fugitive.
Description
Technical Field
The invention belongs to the field of aircraft, and particularly relates to a public safety autonomous inspection quad-rotor unmanned aerial vehicle based on machine vision.
Background
Unmanned aerial vehicle (UAV) technology is developing rapidly, and small and medium-sized UAVs, thanks to their compact airframes, are applied in many scenarios such as search and rescue, aerial photography, natural resource exploration, field hunting, and target surveillance and reconnaissance. The small quad-rotor UAV, being easy to control and agile in the air, can obtain rich environmental information with the help of vision sensors, so it has more application scenarios than other aircraft. Machine-vision-based quad-rotor UAVs are applied to target tracking, traffic management, aerial navigation and the like, but at present most of these functions are carried out by human operators, the degree of automation is very limited, and UAVs that apply artificial-intelligence image recognition to identify human behavior are rarely deployed. Some target recognition schemes process the image information at the ground station; although such schemes can handle complex models, their latency is much higher than processing the images on an onboard image processing board.
Existing patent documents provide implementations of UAVs for the security field. For example, the patents entitled "A multi-rotor unmanned aerial vehicle dynamic security system" (publication number CN 207078318U) and "An unmanned aerial vehicle-based security system" (publication number CN 110766907A) mount an FPV imaging system and some functional modules on a UAV to collect data on the monitored environment, then send the data to a ground station where an operator judges whether the environment is safe. In both patents, all information obtained by the data transmission system is sent to the ground station for a human operator to assess, so the processing of the collected images is completed manually. This implementation is time-consuming and labor-intensive and does not reflect the autonomy and intelligence of the UAV.
There is also a patent that proposes identifying criminals and drawing a criminal trace map by face recognition, namely "Criminal trace map drawing system based on a face recognition technology and a method thereof" (grant publication number CN 103699677B). Its method compares face images collected by fixed cameras installed along roads and streets with criminal face data provided by the police, so as to locate the trails of criminals. Acquiring face data from a large number of fixed cameras involves an enormous amount of information, and the time from collecting a criminal's face data, through processing the image data, to discovering the criminal is long. Compared with first discovering a human target and then actively acquiring that target's face data, this passive acquisition method is inefficient and highly dependent on chance. Moreover, the area covered by fixed cameras is limited, and they are of little use against vigilant criminals. In summary, the trace-searching method adopted in that patent is inefficient, unreliable, poorly adaptable to the environment, and of limited practicality.
Disclosure of Invention
The invention aims to provide a public safety autonomous inspection quad-rotor unmanned aerial vehicle based on machine vision that can, in real time and in all weather, automatically search for suspicious persons or fugitive criminals and track them automatically, automatically perform body-temperature safety checks on human targets, and autonomously detect suspicious gatherings of people. None of these functions requires manual intervention; the UAV accomplishes them autonomously.
The technical scheme of the invention is as follows: a public safety autonomous inspection quad-rotor unmanned aerial vehicle based on machine vision, composed of a power system, a flight control system, a data processing system and a mobile terminal system, characterized in that:
the power system is used to provide flight power for the unmanned aerial vehicle and to supply electrical power to the flight control system and the data processing system;
the flight control system is used for controlling the flight of the quad-rotor unmanned aerial vehicle and providing a stable platform for the unmanned aerial vehicle to acquire a target image;
the data processing system is used for acquiring and processing image data, establishing communication with the mobile terminal system, and sending motion control instructions to the flight control system; after acquiring and processing image data from the high-definition camera, it can autonomously identify suspicious gatherings of people, automatically search for suspicious targets or criminals, command the flight control system to track a target automatically once found, and measure the forehead temperature of a human target with the thermal infrared imager;
the mobile terminal system is used for completing human-computer interaction and information transmission between the mobile terminal system and the data processing system.
Further, the fuselage of the quad-rotor unmanned aerial vehicle comprises, from top to bottom, a frame, a frame base and a landing gear. The frame base consists of two quadrilateral plates; four rotating shafts are inserted at the four corners of the two plates, and the first quadrilateral plate is suspended on the four shafts. The X-shaped frame is mounted on the top ends of the four shafts, their bottom ends are connected to the second quadrilateral plate, and the landing gear is attached to the underside of the second plate. The power system and the flight control system are arranged on the upper face of the first plate of the frame base, and the data processing system on its lower face.
Further, the power system consists of brushless motors, blades, a high-discharge-rate model-aircraft lithium battery and a power supply voltage-stabilizing board. A brushless motor is mounted at each of the four ends of the frame, with a blade fixed on each motor; the lithium battery sits on the upper face of the first quadrilateral plate of the frame base, and the voltage-stabilizing board is mounted at the side of the first and second plates, level with the first plate.
Further, the flight control system consists of an embedded flight microcontroller, an IMU integrated sensor, a monocular ultrasonic sensor, an optical flow sensor and a GPS antenna. The embedded flight microcontroller is suspended above the frame and carries the IMU integrated sensor; the monocular ultrasonic sensor and the optical flow sensor are mounted side by side below the lithium battery, at the side of the voltage-stabilizing board; the GPS antenna is held at the topmost point on an extension of one of the rotating shafts. The IMU integrated sensor combines a six-axis motion sensor, a magnetometer and a barometer.
Further, the data processing system consists of an image processing module, an image acquisition device and a wireless signal receiver, and is used for acquiring and processing image data, establishing communication with the mobile terminal system, and sending motion control instructions to the flight control system. The image acquisition device comprises a three-axis self-stabilizing gimbal, a high-definition camera, an infrared fill light and a thermal infrared imager, and is used to collect image data. The image processing module is mounted on the upper face of the second quadrilateral plate, below the first; a bracket on the lower face of the second plate holds the three-axis self-stabilizing gimbal, the high-definition camera and the thermal infrared imager. The image processing module processes image data with a deep-learning image processing algorithm (Lu Jian, He Jinxin, Li Zheng, et al. A survey of target detection based on deep learning [J]. Electronics Optics & Control, 2020, 27(5):56-63. DOI:10.3969/j.issn.1671-637X.2020.05.012), establishes communication with the mobile terminal system through the wireless signal receiver, and sends motion control instructions to the flight control system through a serial port. After acquiring and processing image data from the image acquisition device, the data processing system can autonomously identify suspicious gatherings of people, automatically find suspicious targets or criminals, command the flight control system to track a target automatically once found, and measure the forehead temperature of a human target with the thermal infrared imager.
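The patent leaves the serial-port protocol between the image processing module and the flight control system unspecified. As a rough illustration only, the sketch below packs a hypothetical motion-control packet (header byte, pixel offsets of the target from the image centre, bounding-box size, additive checksum) with Python's `struct` module; the field names and layout are assumptions, not part of the invention.

```python
import struct

# Hypothetical motion-control packet for the serial link between the image
# processing module and the flight controller. The wire format below is an
# illustrative assumption; the patent does not define one.
HEADER = 0xAA

def pack_motion_command(dx_px, dy_px, target_w_px, target_h_px):
    """Pack the target's signed pixel offsets from the frame centre
    (used to steer the aircraft/gimbal) and its bounding-box size
    (used to hold a suitable distance)."""
    body = struct.pack("<hhHH", dx_px, dy_px, target_w_px, target_h_px)
    checksum = sum(body) & 0xFF          # simple additive checksum
    return bytes([HEADER]) + body + bytes([checksum])

def unpack_motion_command(frame):
    """Inverse of pack_motion_command; validates header and checksum."""
    header, body, checksum = frame[0], frame[1:9], frame[9]
    if header != HEADER or (sum(body) & 0xFF) != checksum:
        raise ValueError("corrupt frame")
    return struct.unpack("<hhHH", body)

# On the vehicle the packet would be written to the flight controller's
# serial port, e.g. with pyserial:
#   serial.Serial("/dev/ttyS0", 115200).write(pkt)
```

The 10-byte fixed layout keeps parsing on the embedded flight microcontroller trivial; any framing scheme with a checksum would serve equally well.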
Further, the mobile terminal system consists of an intelligent terminal and a wireless signal receiver, and completes human-machine interaction and information exchange with the data processing system. Human-machine interaction includes obtaining, through input devices, the deep-learning target image model, the collected target face image data, and the operator's control instructions for the UAV; displaying to the operator, through output devices, the image data from the data processing system, the processed results and the UAV state data; and alerting the operator by vibration and prompt tones. The deep-learning target image model is a convolutional neural network model (Zhou Yu, Zhao Yanming. A survey of convolutional neural networks in image classification and target detection [J]. Computer Engineering and Applications, 2017, 53(13):34-41) obtained by deep learning on target images with the deep-learning image processing algorithm; it contains the depth features of the target image and can be used to detect and recognize the target. Information exchange with the data processing system includes sending the information collected through the input devices to the data processing system via the wireless signal receiver, and receiving from it the image data, the processed results and the UAV state data.
Further, the target can be autonomously identified and tracked offline by the image processing module.
Further, the autonomous offline recognition and tracking of the target by the image processing module specifically includes:
(1) Training of the deep-learning target image model: a convolutional neural network built with the deep-learning image processing algorithm performs deep learning on a training set containing target images, yielding a deep-learning target image model, i.e. a network model containing the depth features of the target image;
(2) Target identification: the image processing module loads the trained deep-learning target image model while the image acquisition device captures video in real time and passes it to the module. The module splits the video into per-frame pictures to form a picture data set, then applies noise filtering and color-format conversion to obtain a picture set containing the target features. After processing by the model, each picture carries the confidence that it contains the target, together with the pixel height, pixel width and pixel coordinates of the target image within the original frame;
(3) Target tracking: the tracking algorithm uses template matching. The feature template is a high-confidence target image from the identification step, and the matching criterion is either the squared-difference method or the correlation-coefficient method, i.e. the similarity between the feature template and the searched image is measured by the squared difference or the correlation coefficient between their image data matrices.
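The two matching criteria named in step (3) can be written directly on grayscale patches held as NumPy arrays. This is an illustrative sketch of the squared-difference and correlation-coefficient measures, not code from the patent:

```python
import numpy as np

def sqdiff(template, patch):
    """Squared-difference score between two equal-size grayscale patches:
    0 means a perfect match, larger is worse."""
    d = template.astype(np.float64) - patch.astype(np.float64)
    return float(np.sum(d * d))

def corr_coeff(template, patch):
    """Correlation coefficient between the mean-removed patches, in [-1, 1];
    1 means a perfect (linear) match."""
    t = template.astype(np.float64) - template.mean()
    p = patch.astype(np.float64) - patch.mean()
    denom = np.sqrt(np.sum(t * t) * np.sum(p * p))
    return float(np.sum(t * p) / denom) if denom > 0 else 0.0
```

The squared difference is cheaper to evaluate, while the correlation coefficient is invariant to uniform brightness changes, which matters for a camera moving through varying light.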
Further, the specific process of the template matching method used in target tracking is as follows:
After obtaining the target feature template, the image processing module searches the first frame captured by the image acquisition device for the image region most similar to the template and outputs that region's pixel coordinates and pixel size, referred to the image centre, to the flight control system. The flight control system uses the pixel coordinates to steer the aircraft or the three-axis self-stabilizing gimbal so that the target image stays at the centre of the frame, and uses the relative pixel size of the target image to keep the aircraft at a suitable distance from the target. The image processing module uses a Kalman filter (Welch G. Kalman Filter [J]. Siggraph Tutorial, 2001) or a particle filter (Gool L, et al. Object Tracking with Adaptive Color-Based Particle Filter [C]// 2002: 353-) to predict, from the target's motion in the current frame, its position in the next frame;
When the target position in the next frame has been predicted, the image processing module updates the region found in the current frame as the new feature template for searching the next frame. If no target image is found in a frame, the previous feature template continues to be used on the following frame; and when no image similar to the template has been found within a set number of frames, target recognition is run again on the current frame to obtain a new target feature template;
To counter tracking drift over time, i.e. a growing offset between the tracked target-image position and the actual target position, the image processing module automatically re-runs target detection after the tracking algorithm has been processing images for a period of time.
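One tracking step as described above, an exhaustive squared-difference search followed by a template update on success and template reuse on failure, can be sketched as follows. The function names and the `max_score` acceptance threshold are illustrative assumptions; a real tracker would confine the search to a window around the Kalman- or particle-filter prediction rather than scanning the whole frame.

```python
import numpy as np

def match_template(frame, template):
    """Exhaustive squared-difference search: returns (row, col, score) of the
    best match of `template` in the grayscale `frame`."""
    th, tw = template.shape
    fh, fw = frame.shape
    t = template.astype(np.float64)
    best = (0, 0, np.inf)
    for r in range(fh - th + 1):
        for c in range(fw - tw + 1):
            d = frame[r:r + th, c:c + tw].astype(np.float64) - t
            score = float(np.sum(d * d))
            if score < best[2]:
                best = (r, c, score)
    return best

def track_step(frame, template, max_score):
    """One tracking step: search, then update the template on success
    (as the patent describes); keep the old template on failure."""
    r, c, score = match_template(frame, template)
    if score <= max_score:                       # target found
        th, tw = template.shape
        new_template = frame[r:r + th, c:c + tw].copy()
        return (r, c), new_template, True
    return None, template, False                 # reuse old template
```

Re-running full detection after a fixed number of `track_step` calls, as the paragraph above prescribes, bounds the drift introduced by repeated template updates.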
The functions of automatically searching for suspicious persons or criminals, tracking automatically, performing body-temperature safety checks on human targets, and autonomously detecting suspicious gatherings are realized in the following steps:
(1) A high-performance computer runs the deep-learning image processing algorithm and learns from a large number of human-target images to obtain a deep-learning target image model of the human target; the model, together with face data of targets such as suspicious persons or criminals, is then uploaded to the UAV's data processing system via the mobile terminal system;
(2) While the UAV operates, the image acquisition device collects images and the image processing module identifies and counts the human targets in them; if the number of human targets is excessive, the scene is judged to be a suspicious gathering of people;
(3) After recognizing a human target, the UAV actively moves to a position suitable for capturing a face image. The thermal infrared imager measures the forehead temperature and the high-definition camera captures face image data; if the forehead temperature exceeds a set threshold, judged abnormal, an alarm is raised and the captured face data together with the UAV's longitude and latitude coordinates are sent to the mobile terminal system. At the same time, the uploaded face data of suspicious persons or criminals are compared with the captured image data for face recognition; once a suspicious person or criminal is confirmed, the UAV sends the captured image data and its current longitude and latitude coordinates to the mobile terminal system and begins tracking that person automatically. Otherwise, the same identification is performed on the next human target.
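The decision logic of steps (2) and (3) can be sketched as a per-frame routine. The thresholds, the detector output format, and the `measure_forehead`/`match_wanted_face` callables are placeholders; the patent prescribes the logic, not these concrete values.

```python
# Illustrative sketch of the patrol decision logic in steps (2)-(3).
CROWD_THRESHOLD = 10      # assumed: more people than this flags a gathering
FEVER_THRESHOLD = 37.3    # assumed forehead-temperature alarm threshold, deg C

def patrol_frame(detections, measure_forehead, match_wanted_face, position):
    """Process one frame's human detections and return the events to report.

    detections: list of per-person dicts, e.g. {"face": ...}
    measure_forehead: callable person -> temperature (thermal-imager stand-in)
    match_wanted_face: callable face -> bool (comparison with the wanted list)
    position: (longitude, latitude) of the UAV
    """
    events = []
    if len(detections) > CROWD_THRESHOLD:
        events.append(("suspicious_gathering", len(detections), position))
    for person in detections:
        temp = measure_forehead(person)
        if temp > FEVER_THRESHOLD:
            events.append(("abnormal_temperature", temp, position))
        if match_wanted_face(person["face"]):
            events.append(("wanted_person", person["face"], position))
            events.append(("start_tracking", person["face"], position))
            break  # switch to tracking; remaining targets checked later
    return events
```

Each reported event carries the UAV's coordinates, matching the requirement that face data and longitude/latitude be sent to the mobile terminal system together.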
The beneficial effects of the invention are as follows. The technique works offline: the voluminous, computation-heavy image data need not be transmitted to a ground station via an image link; instead, the UAV's own image processing module completes the feature recognition of targets in the image. This avoids data loss and high latency in wireless transmission, lets the UAV perform the required functions in real time, and improves the sensitivity, response speed and scene adaptability of the UAV system. The image acquisition device uses the high-definition camera and the thermal infrared imager together, with the infrared fill light switched on at night, so the UAV can work in both bright and dark environments through the camera and obtain the target temperature through the thermal imager, greatly reducing the influence of the environment on its operation. The deep-learning target image model and the face image data of suspicious persons or criminals are uploaded through the mobile terminal system, which is simple to operate, applicable to many scenarios, and highly practical. Once the UAV holds several deep-learning target image models it searches for targets automatically, improving its automation and intelligence. When identifying a person's face, the UAV actively moves to a suitable position to capture the face image data, which raises the probability of finding dangerous targets such as criminals.
Drawings
Fig. 1 is a diagram of a system architecture of an unmanned aerial vehicle of the present invention;
FIG. 2 is a front view of the drone structure of the present invention;
fig. 3 is a bottom view of the drone structure of the present invention;
fig. 4 is a left side view of the drone structure of the present invention;
FIG. 5 is a right side view of the drone structure of the present invention;
fig. 6 is a flow chart of the unmanned aerial vehicle safety patrol of the present invention;
FIG. 7 is a schematic diagram of the motion control of the present invention;
Description of reference numerals: in figs. 2, 3, 4 and 5 the front view direction faces the side of the body opposite the camera. 1 - power system; 2 - flight control system; 3 - data processing system; 4 - mobile terminal system; 5 - frame; 6 - frame base; 7 - landing gear; 8 - rotating shaft; 9 - first quadrilateral plate; 10 - second quadrilateral plate; 101 - brushless motor; 102 - blade; 103 - high-discharge-rate model-aircraft lithium battery; 104 - power supply voltage-stabilizing board; 201 - embedded flight microcontroller; 202 - IMU integrated sensor consisting of a six-axis motion sensor, a magnetometer and a barometer; 203 - monocular ultrasonic sensor; 204 - optical flow sensor; 205 - GPS antenna; 301 - image processing module; 302 - three-axis self-stabilizing gimbal; 303 - high-definition camera; 304 - thermal infrared imager.
Detailed Description
The present invention is described in further detail below with reference to the drawings and examples. As shown in figs. 1-7, the machine-vision-based public safety autonomous inspection quad-rotor unmanned aerial vehicle of the invention comprises a power system 1, a flight control system 2, a data processing system 3 and a mobile terminal system 4. The fuselage comprises, from top to bottom, a frame 5, a frame base 6 and a landing gear 7. The frame base 6 consists of two quadrilateral plates; four rotating shafts 8 are inserted at the four corners of the two plates, and the first quadrilateral plate 9 is suspended on the four shafts 8. The X-shaped frame 5 is supported on the top ends of the four shafts 8, their bottom ends are connected to the second quadrilateral plate 10, and the landing gear 7 is attached to the underside of the second plate 10. The power system 1 and the flight control system 2 are arranged on the upper face of the first plate 9 of the frame base 6, and the data processing system 3 on its lower face.
The power system 1 consists of brushless motors 101, blades 102, a high-discharge-rate model-aircraft lithium battery 103 and a power supply voltage-stabilizing board 104. A brushless motor 101 is mounted at each of the four ends of the frame 5, with a blade 102 fixed on each motor 101; the lithium battery 103 sits on the upper face of the first quadrilateral plate 9 of the frame base 6, and the voltage-stabilizing board 104 is mounted vertically at the side of the first plate 9 and the second plate 10, level with the first plate 9.
The power system 1 provides flight power for the UAV and supplies power to the flight control system 2 and the data processing system 3. It comprises a power plant, the high-discharge-rate model-aircraft lithium battery 103, the blades 102 and the power supply voltage-stabilizing board 104. The power plant may be an electric drive consisting of the lithium battery 103 together with the brushless motors 101 and electronic speed controllers, or a direct-drive fuel system comprising a fuel tank, an engine, a fuel feeder, a variable-pitch mechanism and a servo. The voltage-stabilizing board 104 regulates the supply voltage from the lithium battery 103 to a stable 5 V DC to drive the flight control system 2 and the data processing system 3.
The flight control system 2 consists of an embedded flight microcontroller 201, an IMU integrated sensor 202, a monocular ultrasonic sensor 203, an optical flow sensor 204, a GPS antenna 205 and an acoustic or optical prompting device. The embedded flight microcontroller 201 is suspended on the upper portion of the rack 5 and carries the IMU integrated sensor 202; the monocular ultrasonic sensor 203 and the optical flow sensor 204 are arranged side by side at the lower end of the high-rate model-aircraft lithium battery 103 and the side end of the power-supply voltage-stabilizing plate 104; the GPS antenna 205 is mounted at the topmost end on a lengthened one of the rotating shafts 8. The IMU integrated sensor 202 integrates a six-axis motion sensor, a magnetometer and a barometer.
The flight control system 2 controls the flight of the quad-rotor unmanned aerial vehicle and provides a stable platform while the unmanned aerial vehicle acquires target images. It comprises the embedded flight microcontroller 201, a height measuring device, an attitude sensor, a data transceiver, a horizontal displacement sensor, a navigation system and an acoustic or optical prompting device. The flight microcontroller may employ an FPGA-based, ARM-based, Atmel-based or Raspberry Pi-based platform. The height measuring device measures the altitude of the unmanned aerial vehicle and can adopt an ultrasonic distance sensor, a barometer, a laser ranging device or a GPS device, or combine several of these to obtain height information. The attitude sensor may be a nine-axis attitude sensor composed of a six-axis motion sensor (three-axis accelerometer plus three-axis gyroscope) and a three-axis magnetometer. The data transceiver receives the motion control instructions of the unmanned aerial vehicle and can adopt a wireless transceiver using the WiFi or Bluetooth protocol, a data transmission radio station, or a mobile communication device. The horizontal displacement sensor acquires horizontal displacement data so as to keep the unmanned aerial vehicle level and stable, and can adopt the optical flow sensor. The navigation system acquires the longitude, latitude and altitude of the unmanned aerial vehicle and adopts a GPS navigation system. The acoustic or optical prompting device emits prompting sound or light and can be an optical output device, an acoustic output device, or a combination of the two; here a loudspeaker and a light output device are adopted.
The data processing system 3 consists of an image processing module 301, a three-axis self-stabilizing pan-tilt 302, a high-definition camera 303 and a thermal infrared imager 304. The image processing module 301 is arranged at the upper end of the second quadrilateral plate 10, below the first quadrilateral plate 9; a support at the lower end of the second quadrilateral plate 10 holds the three-axis self-stabilizing pan-tilt 302, the high-definition camera 303 and the thermal infrared imager 304.
The data processing system 3 acquires and processes image data, establishes communication with the mobile terminal system 4, and sends motion control commands to the flight control system 2. It comprises the image processing module 301, an image acquisition device and a wireless signal transceiver; the wireless signal transceiver can adopt the same scheme as the data transceiver in the flight control system. The image processing module 301 processes image data and communicates with the mobile terminal system 4; it may specifically adopt an NVIDIA Jetson AI computer, a Baidu PaddlePaddle EdgeBoard deep learning computing card, an Arduino open-source hardware board, the Huashi artificial intelligence (AI) platform, or the Deep Technology DP-8000 AI development board. The image acquisition device includes the self-stabilizing pan-tilt (gimbal), the high-definition camera with an infrared fill light, and the thermal infrared imager. The self-stabilizing pan-tilt provides a stable platform for the camera and the thermal infrared imager and can adopt a multi-axis pan-tilt; the camera can be monocular or multi-ocular; the infrared fill light is switched on at night, so that image data can be obtained through the high-definition camera whether the environment is bright or dark, enabling all-day operation of the unmanned aerial vehicle; the thermal infrared imager obtains target images carrying temperature data.
The mobile terminal system 4 is composed of an intelligent terminal and a wireless signal transceiver and completes human-computer interaction and information transmission with the data processing system 3. The wireless signal transceiver can receive the data sent by the onboard data processing system 3 and send the operator's commands; it can adopt a WiFi wireless transceiver, a data transmission radio station, or a mobile communication device. The intelligent terminal can be a computer or a smartphone running an intelligent application program, and is provided with input and output equipment; the application program provides the operating interface. In a specific implementation, the mobile terminal system 4 receives data sent by the data processing system 3 through the wireless signal transceiver and displays it to the operator through the application program, reminding the operator by vibration and a warning tone; after receiving the information, the operator decides whether to transmit a continue-tracking instruction to the intelligent terminal through the application program, and the intelligent terminal sends the operation instruction data to the onboard data processing system 3 through the wireless signal transceiver.
The working process of the invention is as follows:
(1) An operator inputs target face image data and a deep learning target image model through the mobile terminal equipment and formulates a flight task, which includes planning a flight route, setting a target, and setting the task content and priority. The task content includes fugitive identification and tracking, suspicious crowd-cluster identification, and identification and tracking of persons with abnormal body temperature. After data input and setting are finished, the mobile terminal system 4 sends the task information to the data processing system 3.
(2) After receiving the task information and a start instruction, the data processing system 3 starts the system self-check. If the self-check fails, the data processing system 3 feeds back the error information to the mobile terminal system 4; if the self-check completes without error, the unmanned aerial vehicle starts automatically and begins to fly along the set route.
(3) After the unmanned aerial vehicle is started, the data processing system 3 sends the air route data to the flight control system 2, and the flight control system 2 controls the unmanned aerial vehicle to fly according to the set air route.
(4) During flight, the data processing system 3 collects image data of the surrounding environment, identifies human targets in the images with a deep learning image processing algorithm, calculates the person density, and judges whether a suspicious crowd cluster exists. The identification information is sent to the mobile terminal system 4 for the operator to confirm. If the operator is offline, the unmanned aerial vehicle directly calibrates the GPS position of the cluster target (i.e., the longitude and latitude of the unmanned aerial vehicle at that moment), stores the image information of the cluster target and sends it to the mobile intelligent terminal, then continues to fly along the set route.
(5) If the tasks of fugitive identification and tracking or abnormal body temperature identification and tracking are set, the data processing system 3, after identifying a human target, also calculates an optimal path to a position suitable for acquiring face data and sends the flight control instruction to the flight control system 2. After flying to that position, the acquired face data are compared with the target face data uploaded by the mobile terminal system 4. If the target is confirmed to be a fugitive, the data processing system 3 sends the target image processing result to the mobile terminal system 4 to remind the operator; if the operator is offline, the unmanned aerial vehicle automatically tracks the fugitive target until the operator sends a cancel instruction. While the target face is being scanned, the thermal infrared imager also acquires facial infrared image data and measures the forehead temperature of the person; if the temperature is abnormal, the data processing system 3 sends the target image data, the processed image data and the forehead temperature to the mobile terminal system 4. Whether an abnormal-forehead-temperature target is tracked can be specified in the task settings or set in real time on the mobile terminal system 4.
(6) After the task has been executed along the route, the unmanned aerial vehicle automatically returns.
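The face comparison in step (5) can be sketched as an embedding-similarity check. This is an illustrative sketch only: the patent does not specify an embedding model, a similarity metric or a decision threshold, so `cosine_similarity`, `is_same_person` and the 0.6 threshold below are assumptions.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def is_same_person(query: np.ndarray, reference: np.ndarray,
                   threshold: float = 0.6) -> bool:
    """Declare a match when similarity exceeds the threshold.

    The threshold is a placeholder; a deployed system would calibrate
    it on a validation set of known face pairs.
    """
    return cosine_similarity(query, reference) >= threshold
```

In practice the embeddings would come from the onboard deep-learning model; here any fixed-length vectors serve to illustrate the comparison.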
The operation flow of the machine-vision-based public safety autonomous patrol quad-rotor unmanned aerial vehicle is shown in figure 6. At startup, the power switch on the power voltage stabilizing plate 104 is turned on first, then each system performs self-check and initialization, at which point the three-axis self-stabilizing pan-tilt 302 also returns to its initial position. After initialization, the flight control system 2 prompts with light and speaker tones. Once the mobile terminal system 4 confirms the start, the unmanned aerial vehicle automatically takes off to the set safe height; if the automatic mode is enabled and the working data have been uploaded, it begins to control the three-axis self-stabilizing pan-tilt 302 to search for a target, otherwise it hovers and waits for an instruction. After the automatic identification and tracking mode is started, the data processing system 3 controls the three-axis self-stabilizing pan-tilt 302 to rotate and search for a target, and the image captured by the high-definition camera 303 is transmitted to the image processing module 301 in real time. The image processing module 301 detects targets in the received images using the trained model; when a target is identified in the picture, a target motion model is rapidly established from the pixel coordinates of the target's center and its pixel size in the previous frame, the pixel coordinates of the target in the next frame are predicted by a filter such as a Kalman filter or a particle filter, and the motion control data of the unmanned aerial vehicle are calculated and sent to the embedded flight microcontroller 201.
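The next-frame prediction described above can be sketched as a constant-velocity Kalman filter over the target's centre pixel. This is an illustrative implementation, not the patent's: the state layout, the frame period `dt`, and the noise covariances `Q` and `R` are assumed values.

```python
import numpy as np

class PixelKalman:
    """Constant-velocity Kalman filter over the target centre pixel.

    State vector: [x, y, vx, vy] in pixels and pixels/frame.
    Noise magnitudes are illustrative placeholders.
    """
    def __init__(self, x0: float, y0: float, dt: float = 1.0):
        self.x = np.array([x0, y0, 0.0, 0.0])
        self.P = np.eye(4) * 10.0          # initial state uncertainty
        self.F = np.array([[1, 0, dt, 0],  # constant-velocity transition
                           [0, 1, 0, dt],
                           [0, 0, 1,  0],
                           [0, 0, 0,  1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0],   # we only observe (x, y)
                           [0, 1, 0, 0]], dtype=float)
        self.Q = np.eye(4) * 0.01          # process noise
        self.R = np.eye(2) * 1.0           # measurement noise

    def predict(self) -> np.ndarray:
        """Predict the next-frame state; returns predicted (x, y)."""
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, zx: float, zy: float) -> None:
        """Correct the state with a measured centre pixel."""
        z = np.array([zx, zy])
        y = z - self.H @ self.x                       # innovation
        S = self.H @ self.P @ self.H.T + self.R       # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)      # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
```

Each video frame, the tracker calls `predict()` to obtain the expected pixel position (used to compute the motion control data) and then `update()` with the measured detection.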
The embedded flight microcontroller 201 rapidly steers the fuselage to a suitable position to track the target and acquire target image information. After the image data of the high-definition camera 303 and the thermal infrared imager 304 are acquired, the image processing module 301 performs feature recognition and other processing on the image information to judge whether the person is a suspect or a fugitive, whether a suspicious crowd cluster is present, and whether the body temperature is normal; the body temperature is obtained by the thermal infrared imager 304 measuring the forehead temperature of the face. If the processing result is of value, such as identifying a fugitive-class suspicious target, detecting suspicious dangerous crowd behavior, or finding a person with abnormal body temperature, the data processing system 3 sends the image data of the suspicious target, the image identification data of the image processing module 301 and the GPS position of the unmanned aerial vehicle to the mobile terminal system 4. After receiving the data, the mobile terminal system 4 presents them to the operator through the application interface, and the operator confirms whether to track. If there is no operator instruction, the unmanned aerial vehicle tracks the fugitive in real time and continuously transmits data to the mobile terminal system 4.
The flight control principle of the quad-rotor unmanned aerial vehicle is shown in figure 7. The flight control system 2 consists of cascaded double closed-loop PID controllers. The outer loop is the angle loop: it feeds back the current yaw angle, its goal is to reach the desired angle, and the output of the outer PID is passed to the inner loop as the desired angular velocity. The inner loop is the angular-rate loop: its goal is to reach the desired angular velocity supplied by the outer loop, and its output is the rotation-speed control parameter of the brushless motors. With these two loops, the motion and pointing of the drone are much smoother. The outputs of the double closed-loop PID controller are given below, where out(t) is the output of the continuous-system PID controller and out(k) that of the discrete-system PID controller; err(k) is the deviation between the expected value and the actual value at time k; Kp, Ki and Kd are respectively the proportional, integral and differential coefficients; and T is the data update period:

$$\mathrm{out}(t) = K_p\,\mathrm{err}(t) + K_i \int_0^{t} \mathrm{err}(\tau)\,d\tau + K_d\,\frac{d\,\mathrm{err}(t)}{dt}$$

$$\mathrm{out}(k) = K_p\,\mathrm{err}(k) + K_i T \sum_{j=0}^{k} \mathrm{err}(j) + K_d\,\frac{\mathrm{err}(k) - \mathrm{err}(k-1)}{T}$$
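The discrete PID form and the cascade wiring (outer angle loop feeding the inner rate loop) can be sketched in a few lines; the gains and update period used here are placeholders, not tuned values from the patent.

```python
class PID:
    """Discrete positional PID:
    out(k) = Kp*err(k) + Ki*T*sum(err) + Kd*(err(k) - err(k-1))/T
    """
    def __init__(self, kp: float, ki: float, kd: float, t: float):
        self.kp, self.ki, self.kd, self.t = kp, ki, kd, t
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, setpoint: float, measured: float) -> float:
        err = setpoint - measured
        self.integral += err * self.t
        deriv = (err - self.prev_err) / self.t
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

def cascade_step(outer: PID, inner: PID,
                 desired_angle: float, angle: float, rate: float) -> float:
    """One control tick of the cascade: the outer loop turns the angle
    error into a desired angular rate, which the inner loop turns into
    a motor-speed correction."""
    desired_rate = outer.step(desired_angle, angle)
    return inner.step(desired_rate, rate)
```

With the inner loop running faster than the outer loop in a real autopilot; here both are stepped together for clarity.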
In a specific implementation, the quad-rotor unmanned aerial vehicle obtains motion and attitude data through the attitude sensor of the flight control system 2. At low altitude, the height above ground is obtained by fusing the data of the various height measuring devices through a Kalman filter; at high altitude it is obtained from the GPS module. During a safety patrol, the image processing module 301 of the data processing system 3 processes each video frame in real time with the trained model, obtaining the number of human bodies and the person density in the images. The model is obtained by deep learning on a large annotated set of target images with a convolutional neural network. The person density is calculated from the number of human bodies and their relative distances to the unmanned aerial vehicle, where the relative distance is derived from the ratio of the number of pixels occupied by a human body to the total number of pixels in the image. The unmanned aerial vehicle judges from the person density whether dangerous crowd-clustering behavior exists. Meanwhile, the unmanned aerial vehicle performs face recognition on human targets through the high-definition camera 303: the face model data collected by the operator are uploaded to the mobile terminal system 4 and forwarded to the data processing system 3, and the thermal infrared imager 304 obtains the head temperature of the target during face recognition, so that fugitives and persons with abnormal body temperature can both be recognized.
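The density computation described above can be sketched as follows. The reciprocal pixel-ratio distance proxy and the squared-distance normalisation are illustrative assumptions; the patent only states that the head-count and the pixel ratio are used.

```python
def relative_distance(person_pixels: int, total_pixels: int) -> float:
    """Distance proxy: a smaller pixel share means a farther target.
    The reciprocal-of-ratio form is an illustrative choice."""
    return total_pixels / max(person_pixels, 1)

def crowd_density(person_count: int, person_pixel_counts, total_pixels: int) -> float:
    """People per unit of (proxy) viewed ground area."""
    if person_count == 0:
        return 0.0
    avg_dist = sum(relative_distance(p, total_pixels)
                   for p in person_pixel_counts) / person_count
    # Farther targets imply a wider ground footprint in the frame, so
    # normalise the head-count by the squared distance proxy.
    return person_count / (avg_dist ** 2)

def is_suspicious_cluster(density: float, threshold: float) -> bool:
    """Flag dangerous crowd-clustering when density exceeds a set threshold."""
    return density >= threshold
```

The threshold would be chosen per deployment scenario; any consistent pixel-count inputs from the detector suffice for the calculation.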
Whether the environment is bright or dark, image data are obtained through the high-definition camera 303 and the infrared fill light, enabling all-day operation of the unmanned aerial vehicle.
The above is a preferred embodiment of the present invention, but the present invention is not limited to the embodiment and the disclosure of the drawings. Therefore, it is intended that all equivalents and modifications which do not depart from the spirit of the invention disclosed herein are deemed to fall within the scope of the invention.
Claims (9)
1. A machine vision based public safety autonomous inspection quad-rotor unmanned aerial vehicle, composed of a power system, a flight control system, a data processing system and a mobile terminal system, characterized in that:
the power system is used for providing flight power for the unmanned aerial vehicle and providing power supplies for the flight control system and the data processing system;
the flight control system is used for controlling the flight of the quad-rotor unmanned aerial vehicle and providing a stable platform for the unmanned aerial vehicle to acquire a target image;
the data processing system is used for acquiring and processing image data, establishing communication with the mobile terminal system and sending a motion control instruction to the flight control system; the data processing system can automatically identify suspicious personnel clusters and automatically search suspicious targets or criminals after acquiring and processing image data through the high-definition camera, can control the flight control system to automatically track the targets after finding the targets, and can also automatically measure the forehead temperature of a human body target through the thermal infrared imager;
the mobile terminal system is used for completing human-computer interaction and information transmission between the mobile terminal system and the data processing system.
2. A machine vision based public safety autonomous patrol quad-rotor drone according to claim 1, characterized in that: the fuselage structure of the quad-rotor unmanned aerial vehicle comprises, from top to bottom, a rack, a rack base and an undercarriage; the rack base consists of two quadrilateral plates, four rotating shafts are inserted into the four corners of the two plates, and a first quadrilateral plate is suspended on the four rotating shafts; the top ends of the four rotating shafts support an X-shaped rack, the bottom ends are connected with a second quadrilateral plate, and the lower end of the second quadrilateral plate is connected with the undercarriage; the power system and the flight control system are arranged at the upper end of the first quadrilateral plate of the rack base, and the data processing system is arranged at its lower end.
3. A machine vision based public safety autonomous patrol quad-rotor drone according to claim 2, characterized in that: the power system consists of brushless motors, blades, a high-rate model-aircraft lithium battery and a power-supply voltage-stabilizing plate; a brushless motor is installed at each of the four end parts of the rack, with a blade fixed on each brushless motor; the high-rate model-aircraft lithium battery is arranged at the upper end of the first quadrilateral plate of the rack base, and the power-supply voltage-stabilizing plate is arranged at the side ends of the first and second quadrilateral plates, flush in height with the first quadrilateral plate.
4. A machine vision based public safety autonomous patrol quad-rotor drone according to claim 2, characterized in that: the flight control system consists of an embedded flight microcontroller, an IMU integrated sensor, a monocular ultrasonic sensor, an optical flow sensor and a GPS antenna; the embedded flight microcontroller is suspended on the upper portion of the rack and carries the IMU integrated sensor; the monocular ultrasonic sensor and the optical flow sensor are arranged side by side at the lower end of the high-rate model-aircraft lithium battery and the side end of the power-supply voltage-stabilizing plate; the GPS antenna is mounted at the topmost end on a lengthened one of the rotating shafts; and the IMU integrated sensor integrates a six-axis motion sensor, a magnetometer and a barometer.
5. A machine vision based public safety autonomous patrol quad-rotor drone according to claim 2, characterized in that: the data processing system consists of an image processing module, a three-axis self-stabilizing pan-tilt, a high-definition camera and a thermal infrared imager; the image processing module is arranged at the upper end of the second quadrilateral plate, below the first quadrilateral plate; and a support at the lower end of the second quadrilateral plate holds the three-axis self-stabilizing pan-tilt, the high-definition camera and the thermal infrared imager.
6. A machine vision based public safety autonomous patrol quad-rotor drone according to claim 2, characterized in that: the mobile terminal system is composed of an intelligent terminal and a wireless signal transceiver; the mobile terminal system receives the data sent by the data processing system through the wireless signal transceiver and displays them to the operator, reminding the operator by vibration and a prompt tone; after receiving the information, the operator decides whether to transmit a continue-tracking instruction to the intelligent terminal, and the intelligent terminal sends the operation instruction data to the onboard data processing system through the wireless signal transceiver.
7. A machine vision based public safety autonomous patrol quad-rotor unmanned aerial vehicle according to claim 5, characterized in that the target can be autonomously identified and tracked offline through the image processing module.
8. A machine vision based public safety autonomous patrol quad-rotor drone according to claim 7, characterized in that: the autonomous offline identification and tracking of the target through the image processing module specifically comprises the following steps:
(1) training of a deep learning target image model: performing deep learning on a target image training set containing a target image by adopting a convolutional neural network established by a deep learning image processing algorithm to obtain a deep learning target image model containing a target image depth feature, namely a network model containing the target image depth feature;
(2) target identification: the image processing module loads the deep learning target image model obtained after training, while the image acquisition device acquires video data in real time and transmits them to the image processing module; the image processing module converts the video data into frame-by-frame pictures to form a picture data set, then performs noise filtering and color format conversion on the picture data set to obtain a picture data set containing target features; after the picture data set is processed by the deep learning target image model, each picture carries the confidence of the target and the pixel height, pixel width and pixel coordinates of the target image within the original image;
(3) target tracking: the target tracking algorithm adopts a template matching method, where the feature template is a high-confidence target image obtained from target identification, and the matching method is a squared-difference matching method or a correlation-coefficient matching method, i.e., the correlation between the feature template and the searched image is measured by the squared difference or the correlation coefficient between the image data matrices.
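The squared-difference matching named in step (3) can be sketched with an exhaustive NumPy search, equivalent in spirit to OpenCV's `TM_SQDIFF` mode (a real implementation would use `cv2.matchTemplate` for speed):

```python
import numpy as np

def match_template_ssd(image: np.ndarray, template: np.ndarray):
    """Exhaustive squared-difference template match over a grayscale image.

    Slides the template over every valid position and returns the
    top-left (row, col) of the patch with the smallest sum of squared
    differences.
    """
    ih, iw = image.shape
    th, tw = template.shape
    best, best_pos = float("inf"), (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            patch = image[r:r + th, c:c + tw]
            ssd = np.sum((patch.astype(float) - template.astype(float)) ** 2)
            if ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos
```

The correlation-coefficient variant replaces the SSD score with the normalized correlation between the mean-subtracted patch and template, selecting the maximum instead of the minimum.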
9. A machine vision based public safety autonomous patrol quad-rotor unmanned aerial vehicle according to claim 8, wherein: the specific process of the template matching method used in target tracking is as follows:
After the image processing module acquires the target feature template, it searches the first frame acquired by the image acquisition device at that moment for the image most similar to the feature template, and outputs the center pixel coordinates and pixel size of that image within the picture to the flight control system; the flight control system uses the pixel coordinates to control the aircraft or the three-axis self-stabilizing pan-tilt so that the center of the target image stays in the middle of the picture, and uses the relative pixel size of the target image to keep the aircraft at a suitable distance from the target; meanwhile, the image processing module predicts the position of the target in the next frame by a Kalman filter or a particle filter;
When the target position of the next frame has been predicted, the image processing module updates the image found in the first frame into a new feature template with which to search the next frame; if no target image is found in the first frame, the previous feature template continues to be used on the next frame; and when no image similar to the feature template has been found within a set number of frames, target recognition is performed again on the current frame to obtain a new target feature template;
To counter the tracking drift that accumulates over time, i.e., a growing offset between the position of the tracked target image and that of the actual target, the image processing module automatically performs target detection again after running the target tracking algorithm for a period of time.
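The update-miss-redetect logic of claim 9's template matching can be sketched as the following loop; `matcher`, `detector`, `miss_limit` and `redetect_every` are hypothetical names and values used only for illustration.

```python
def track(frames, init_template, matcher, detector,
          miss_limit=10, redetect_every=50):
    """Template-update tracking loop sketched from the claim:
    - on a successful match, the matched patch becomes the new template;
    - after `miss_limit` consecutive misses, fall back to full detection;
    - every `redetect_every` frames, re-detect anyway to correct drift.

    `matcher(frame, template)` returns the matched patch or None;
    `detector(frame)` returns a fresh template patch. Both are assumed
    callables standing in for the template matcher and the deep model.
    """
    template, misses = init_template, 0
    positions = []
    for i, frame in enumerate(frames):
        if i and i % redetect_every == 0:
            template, misses = detector(frame), 0  # periodic drift correction
        patch = matcher(frame, template)
        if patch is None:
            misses += 1
            if misses >= miss_limit:
                template, misses = detector(frame), 0  # lost: re-detect
        else:
            template, misses = patch, 0  # matched: update the template
        positions.append(patch)
    return positions
```

Swapping in `match_template_ssd` for `matcher` and the trained detector for `detector` would yield the full pipeline the claim describes.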
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010691797.XA CN111824406A (en) | 2020-07-17 | 2020-07-17 | Public safety independently patrols four rotor unmanned aerial vehicle based on machine vision |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111824406A true CN111824406A (en) | 2020-10-27 |
Family
ID=72924302
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112394746A (en) * | 2020-11-23 | 2021-02-23 | 武汉科技大学 | Intelligent epidemic prevention unmanned aerial vehicle based on machine learning and control method thereof |
CN113052115A (en) * | 2021-04-06 | 2021-06-29 | 合肥工业大学 | Unmanned aerial vehicle airborne vital sign detection method based on video method |
CN113111715A (en) * | 2021-03-13 | 2021-07-13 | 浙江御穹电子科技有限公司 | Unmanned aerial vehicle target tracking and information acquisition system and method |
CN113119082A (en) * | 2021-03-18 | 2021-07-16 | 深圳市优必选科技股份有限公司 | Visual recognition circuit, visual recognition device, and robot |
CN113268071A (en) * | 2021-01-28 | 2021-08-17 | 北京理工大学 | Unmanned aerial vehicle tracing method and system based on multi-sensor fusion |
CN113306741A (en) * | 2021-04-16 | 2021-08-27 | 西安航空职业技术学院 | External winding inspection unmanned aerial vehicle and method based on deep learning |
CN113625777A (en) * | 2021-09-22 | 2021-11-09 | 福建江夏学院 | Multifunctional flight control circuit and method based on unmanned aerial vehicle |
CN116778360A (en) * | 2023-06-09 | 2023-09-19 | 北京科技大学 | Ground target positioning method and device for flapping-wing flying robot |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106570892A (en) * | 2015-08-18 | 2017-04-19 | 航天图景(北京)科技有限公司 | Moving-target active tracking method based on edge enhancement template matching |
CN106774436A (en) * | 2017-02-27 | 2017-05-31 | 南京航空航天大学 | The control system and method for the rotor wing unmanned aerial vehicle tenacious tracking target of view-based access control model |
WO2017115120A1 (en) * | 2015-12-29 | 2017-07-06 | Hangzhou Zero Zero Technology Co., Ltd. | System and method for automated aerial system operation |
CN107817820A (en) * | 2017-10-16 | 2018-03-20 | 复旦大学 | A kind of unmanned plane autonomous flight control method and system based on deep learning |
CN107851358A (en) * | 2015-07-09 | 2018-03-27 | 诺基亚技术有限公司 | Monitoring |
US20180107874A1 (en) * | 2016-01-29 | 2018-04-19 | Panton, Inc. | Aerial image processing |
CN109324638A (en) * | 2018-12-05 | 2019-02-12 | 中国计量大学 | Quadrotor drone Target Tracking System based on machine vision |
US20190057244A1 (en) * | 2017-08-18 | 2019-02-21 | Autel Robotics Co., Ltd. | Method for determining target through intelligent following of unmanned aerial vehicle, unmanned aerial vehicle and remote control |
CN109787679A (en) * | 2019-03-15 | 2019-05-21 | 郭欣 | Police infrared arrest system and method based on multi-rotor unmanned aerial vehicle |
CN110232307A (en) * | 2019-04-04 | 2019-09-13 | 中国石油大学(华东) | A kind of multi-frame joint face recognition algorithms based on unmanned plane |
CN110673641A (en) * | 2019-10-28 | 2020-01-10 | 上海工程技术大学 | Passenger plane intelligent maintenance inspection system platform based on unmanned aerial vehicle |
CN111275760A (en) * | 2020-01-16 | 2020-06-12 | 上海工程技术大学 | Unmanned aerial vehicle target tracking system and method based on 5G and depth image information |
-
2020
- 2020-07-17 CN CN202010691797.XA patent/CN111824406A/en active Pending
Patent Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107851358A (en) * | 2015-07-09 | 2018-03-27 | 诺基亚技术有限公司 | Monitoring |
CN106570892A (en) * | 2015-08-18 | 2017-04-19 | 航天图景(北京)科技有限公司 | Moving-target active tracking method based on edge enhancement template matching |
WO2017115120A1 (en) * | 2015-12-29 | 2017-07-06 | Hangzhou Zero Zero Technology Co., Ltd. | System and method for automated aerial system operation |
US20180107874A1 (en) * | 2016-01-29 | 2018-04-19 | Panton, Inc. | Aerial image processing |
CN106774436A (en) * | 2017-02-27 | 2017-05-31 | 南京航空航天大学 | Vision-based control system and method for stable target tracking by a rotor unmanned aerial vehicle |
US20190057244A1 (en) * | 2017-08-18 | 2019-02-21 | Autel Robotics Co., Ltd. | Method for determining target through intelligent following of unmanned aerial vehicle, unmanned aerial vehicle and remote control |
CN107817820A (en) * | 2017-10-16 | 2018-03-20 | 复旦大学 | Unmanned aerial vehicle autonomous flight control method and system based on deep learning |
CN109324638A (en) * | 2018-12-05 | 2019-02-12 | 中国计量大学 | Quadrotor drone target tracking system based on machine vision |
CN109787679A (en) * | 2019-03-15 | 2019-05-21 | 郭欣 | Police infrared arrest system and method based on multi-rotor unmanned aerial vehicle |
CN110232307A (en) * | 2019-04-04 | 2019-09-13 | 中国石油大学(华东) | Multi-frame joint face recognition algorithm based on unmanned aerial vehicle |
CN110673641A (en) * | 2019-10-28 | 2020-01-10 | 上海工程技术大学 | Passenger aircraft intelligent maintenance inspection system platform based on unmanned aerial vehicle |
CN111275760A (en) * | 2020-01-16 | 2020-06-12 | 上海工程技术大学 | Unmanned aerial vehicle target tracking system and method based on 5G and depth image information |
Non-Patent Citations (1)
Title |
---|
Yu Bin (鱼滨): "Image Processing Based on MATLAB and Genetic Algorithms" (《基于MATLAB和遗传算法的图像处理》), 1 September 2015, Xidian University Press (西安电子科技大学出版社) *
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112394746A (en) * | 2020-11-23 | 2021-02-23 | 武汉科技大学 | Intelligent epidemic prevention unmanned aerial vehicle based on machine learning and control method thereof |
CN113268071A (en) * | 2021-01-28 | 2021-08-17 | 北京理工大学 | Unmanned aerial vehicle tracing method and system based on multi-sensor fusion |
CN113111715A (en) * | 2021-03-13 | 2021-07-13 | 浙江御穹电子科技有限公司 | Unmanned aerial vehicle target tracking and information acquisition system and method |
CN113111715B (en) * | 2021-03-13 | 2023-07-25 | 浙江御穹电子科技有限公司 | Unmanned aerial vehicle target tracking and information acquisition system and method |
CN113119082A (en) * | 2021-03-18 | 2021-07-16 | 深圳市优必选科技股份有限公司 | Visual recognition circuit, visual recognition device, and robot |
CN113052115A (en) * | 2021-04-06 | 2021-06-29 | 合肥工业大学 | Video-based airborne vital sign detection method for unmanned aerial vehicles |
CN113306741A (en) * | 2021-04-16 | 2021-08-27 | 西安航空职业技术学院 | External winding inspection unmanned aerial vehicle and method based on deep learning |
CN113625777A (en) * | 2021-09-22 | 2021-11-09 | 福建江夏学院 | Multifunctional flight control circuit and method based on unmanned aerial vehicle |
CN116778360A (en) * | 2023-06-09 | 2023-09-19 | 北京科技大学 | Ground target positioning method and device for flapping-wing flying robot |
CN116778360B (en) * | 2023-06-09 | 2024-03-19 | 北京科技大学 | Ground target positioning method and device for flapping-wing flying robot |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111824406A (en) | Machine-vision-based quadrotor unmanned aerial vehicle for autonomous public safety patrol |
US11604479B2 (en) | Methods and system for vision-based landing | |
CN107531322B (en) | Aerial capture platform | |
KR102254491B1 (en) | Automatic fly drone embedded with intelligent image analysis function | |
US20200026720A1 (en) | Construction and update of elevation maps | |
EP3901728A1 (en) | Methods and system for autonomous landing | |
CN106774436A (en) | Vision-based control system and method for stable target tracking by a rotor unmanned aerial vehicle |
CN110692027A (en) | System and method for providing easy-to-use release and automatic positioning of drone applications |
CN107817820A (en) | Unmanned aerial vehicle autonomous flight control method and system based on deep learning |
WO2018103689A1 (en) | Relative azimuth control method and apparatus for unmanned aerial vehicle |
CN110494360A (en) | Autonomous system and method for providing photography and images |
CN201217501Y (en) | Suspended autonomous aerial photography aircraft system |
CN105182992A (en) | Unmanned aerial vehicle control method and device |
CN108062108A (en) | Intelligent multi-rotor unmanned aerial vehicle based on an onboard computer and implementation method thereof |
CN110333735B (en) | System and method for realizing unmanned aerial vehicle water and land secondary positioning | |
CN110498039B (en) | Intelligent monitoring system based on bionic flapping wing aircraft | |
JP2017500650A (en) | System and method for data recording and analysis | |
CN102190081B (en) | Vision-based fixed point robust control method for airship | |
CN114967737A (en) | Aircraft control method and aircraft | |
CN206532142U (en) | Vision-based control system for stable tracking of a moving target by a rotor unmanned aerial vehicle |
CN107515622A (en) | Autonomous control method for rotor unmanned aerial vehicle landing on a moving target |
CN106970649A (en) | Unmanned aerial vehicle wireless charging automatic control platform and control method |
CN106094876A (en) | Unmanned aerial vehicle target locking system and method thereof |
CN104816829A (en) | Sky-eye aircraft applicable to reconnaissance |
CN105334347A (en) | Particle image velocimetry system and method based on unmanned aerial vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 2020-10-27