CN113141459B - Unmanned aerial vehicle airborne vision intelligent processing system and method - Google Patents

Unmanned aerial vehicle airborne vision intelligent processing system and method Download PDF

Info

Publication number
CN113141459B
CN113141459B CN202011111242.XA CN202011111242A CN113141459B CN 113141459 B CN113141459 B CN 113141459B CN 202011111242 A CN202011111242 A CN 202011111242A CN 113141459 B CN113141459 B CN 113141459B
Authority
CN
China
Prior art keywords
lens
image
unmanned aerial
aerial vehicle
gyroscope
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011111242.XA
Other languages
Chinese (zh)
Other versions
CN113141459A (en
Inventor
宋韬
李希明
莫雳
金忍
林德福
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Technology BIT
Original Assignee
Beijing Institute of Technology BIT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Technology BIT filed Critical Beijing Institute of Technology BIT
Priority to CN202011111242.XA priority Critical patent/CN113141459B/en
Publication of CN113141459A publication Critical patent/CN113141459A/en
Application granted granted Critical
Publication of CN113141459B publication Critical patent/CN113141459B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/695Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64CAEROPLANES; HELICOPTERS
    • B64C39/00Aircraft not otherwise provided for
    • B64C39/02Aircraft not otherwise provided for characterised by special use
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64DEQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D47/00Equipment not otherwise provided for
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64DEQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D47/00Equipment not otherwise provided for
    • B64D47/08Arrangements of cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/57Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/61Control of cameras or camera modules based on recognised objects
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00UAVs specially adapted for particular uses or applications

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Gyroscopes (AREA)

Abstract

The invention discloses an onboard vision intelligent processing system and method for an unmanned aerial vehicle, wherein the system comprises the unmanned aerial vehicle, a photoelectric pod and an information processing module, the photoelectric pod and the information processing module are carried on the unmanned aerial vehicle, the information processing module comprises an image recognition tracking sub-module, a holder PID (proportion integration differentiation) controller sub-module and a parameter correction sub-module, the parameter correction sub-module acquires detection data of a gyroscope, an accelerometer and a magnetometer, and corrects a lens rotation angle rate parameter output by the image recognition tracking sub-module through the detection data, and the method comprises the following steps: the method comprises the steps of collecting an image, identifying a target in the image, outputting a lens rotation angle rate parameter, correcting the lens rotation angle rate parameter, and controlling the rotation of a photoelectric pod lens through a PID controller. The unmanned aerial vehicle airborne vision intelligent processing system and method can solve or reduce the over-rotation phenomenon, the photoelectric pod lens rotates stably, the lens shaking is greatly reduced, and the shot image is clearer.

Description

Unmanned aerial vehicle airborne vision intelligent processing system and method
Technical Field
The invention relates to an airborne visual intelligent processing system and method for an unmanned aerial vehicle, and belongs to the field of unmanned aerial vehicles.
Background
In recent years, unmanned aerial vehicles have been vigorously developed, have experienced transition from military use to civilization use, and have successfully become a practical tool recognized by the majority of industries and capable of replacing manpower to complete complex work, wherein the continuous tracking of targets by using unmanned aerial vehicles is a common task.
In to the target tracking task, traditional mode is that unmanned aerial vehicle carries the camera through carrying on, for example carries on monocular camera or two mesh cameras, but, the target relative position information that this kind of mode acquireed, need compensate unmanned aerial vehicle's current motion state, just can make relative position information accurate to camera lens is fixed with the unmanned aerial vehicle, and unmanned aerial vehicle's flight gesture can influence the continuous tracking effect of camera to the target, and the target phenomenon of losing appears often.
Still deposit unmanned aerial vehicle and carry on cloud platform among the prior art and shoot the target image, and then obtain the tracking mode of target relative position, however current cloud platform PID control algorithm, when control cloud platform camera lens is rotatory, the phenomenon of overrotation all can appear, for example when the cloud platform is required to rotate from 20 to 40 in reality, under traditional PID control, the cloud platform can be from 20 to overrotation to 45, then circle to 35, again rotate to 40, the phenomenon of overrotation appears many times even, if the target translation rate is too fast this moment, very easily lose the target, and when unmanned aerial vehicle acutely moved, conventional PID control algorithm stability was relatively poor, cause the losing of target easily.
In addition, present unmanned aerial vehicle tracking mode, mostly unmanned aerial vehicle shoot behind the image, transmit the image to the ground station, control unmanned aerial vehicle by the ground station and trail, but this process not only influences the timeliness of system, and the integrality of transmission information also is difficult to guarantee moreover.
Therefore, it is necessary to design an intelligent processing system and method for airborne vision of an unmanned aerial vehicle, which can ensure that the unmanned aerial vehicle efficiently, stably and rapidly tracks a target, enable a worker to clearly understand the state of the unmanned aerial vehicle, and perform manual control when necessary.
Disclosure of Invention
In order to overcome the problems, the inventor of the present invention has conducted intensive research, and on one hand, provides an airborne visual intelligent processing system of an unmanned aerial vehicle, which includes the unmanned aerial vehicle, and a photoelectric pod and an information processing module mounted on the unmanned aerial vehicle.
Further, the photoelectric pod is provided with an inertia measurement unit for detecting attitude information of the photoelectric pod, and the inertia measurement unit comprises a gyroscope, a geomagnetic meter and an accelerometer;
the information processing module comprises an image identification tracking sub-module, a holder PID controller sub-module and a parameter modification sub-module,
the image identification tracking submodule is used for identifying a target in an image, calculating a rotation angle required by moving the target to a photoelectric pod lens at the middle position of the image, and outputting a lens rotation angle rate parameter;
the parameter correction submodule is used for correcting the lens rotation angle rate parameter output by the image recognition tracking submodule;
and the cradle head PID controller submodule receives the corrected lens rotation angle rate parameter and uses the corrected lens rotation angle rate parameter as the input of a PID controller to control the rotation of a motor in the photoelectric pod.
According to the invention, the parameter correction submodule receives the lens rotation angular rate parameter output by the image recognition tracking submodule, acquires detection data of a gyroscope, an accelerometer and a geomagnetism meter, and corrects the lens rotation angular rate parameter through the detection data.
In a preferred embodiment, the unmanned aerial vehicle airborne visual intelligent processing system further comprises a screen projector, wherein the screen projector is connected with the information processing module and transmits the image received by the information processing module to the ground base station.
On the other hand, the invention also provides an airborne visual intelligent processing method of the unmanned aerial vehicle, which comprises the following steps:
s1, collecting an image;
s2, identifying the target in the image and outputting a lens rotation angle rate parameter;
and S3, correcting the rotation angle rate parameter of the lens, and controlling the rotation of the photoelectric pod lens through a PID controller.
In step S1, the capturing the image is to capture the image through the optoelectronic pod and transfer the captured image to the image recognition tracking sub-module.
In step S2, the image is identified by the image identification and tracking submodule, which is provided with a neural network, and the target is identified by the neural network, and the rotation angular rate parameter of the electro-optic pod lens when the target is moved to the middle position of the image is calculated.
In step S3, the correction of the lens rotation angular rate parameter is performed by fusing information detected by the inertial measurement unit, which includes gyroscope detection information, geomagnetism detection information, and speedometer detection information, with the lens rotation angular rate parameter.
In a preferred embodiment, step S3 includes the following sub-steps:
s31, fusing gyroscope detection information to obtain a gyroscope fusion attitude quaternion;
s32, fusing geomagnetic meter detection information and speedometer detection information to obtain a geomagnetic meter and accelerometer fusion attitude quaternion;
and S33, linearly fusing the gyroscope fusion attitude quaternion and the geomagnetism and accelerometer fusion attitude quaternion to finish the correction, and transmitting the corrected attitude quaternion to the PID controller to further realize the steering of the lens.
Further, in step S31, the fused gyroscope attitude quaternion after the fusion of the gyroscopes may be expressed as:
Figure BDA0002728668780000041
wherein the content of the first and second substances,
Figure BDA0002728668780000042
an attitude quaternion estimation value representing the rotation angle rate of the lens at the time of t-1 is obtained by filtering the attitude quaternion representing the rotation angle rate parameter of the lens through a Kalman filter,
Figure BDA0002728668780000043
is the differential of the attitude quaternion, delta t is the detection period of the gyroscope, the differential of the attitude quaternion
Figure BDA0002728668780000044
Can be obtained by the following formula:
Figure BDA0002728668780000045
Sωtangular velocity components of the respective axes in the information are detected for the gyroscope.
The invention has the advantages that:
(1) according to the unmanned aerial vehicle airborne vision intelligent processing system and method provided by the invention, the over-rotation phenomenon can be solved or reduced;
(2) according to the unmanned aerial vehicle airborne vision intelligent processing system and method provided by the invention, the photoelectric pod lens rotates stably, the lens shaking is greatly reduced, and the shot image is clearer.
(3) According to the unmanned aerial vehicle airborne vision intelligent processing system and method provided by the invention, image transmission does not occupy the bandwidth of a communication module, communication interference is reduced, and the reliability of the unmanned aerial vehicle is improved;
(4) according to the network communication system and the network communication method, the load of the server is reduced.
Drawings
Fig. 1 shows a schematic structural diagram of an onboard vision intelligent processing system of a unmanned aerial vehicle according to a preferred embodiment of the invention;
fig. 2 is a flow chart of an intelligent processing method for airborne vision of the unmanned aerial vehicle according to a preferred embodiment of the invention.
Detailed Description
The invention is explained in more detail below with reference to the figures and examples. The features and advantages of the present invention will become more apparent from the description.
The word "exemplary" is used exclusively herein to mean "serving as an example, embodiment, or illustration. Any embodiment described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments. While the various aspects of the embodiments are presented in drawings, the drawings are not necessarily drawn to scale unless specifically indicated.
The invention provides an airborne visual intelligent processing system of an unmanned aerial vehicle, which comprises the unmanned aerial vehicle, a photoelectric pod and an information processing module, wherein the photoelectric pod and the information processing module are carried on the unmanned aerial vehicle.
Unmanned aerial vehicle has flight control unit and communication module, and the flight control unit is used for controlling unmanned aerial vehicle's flight, and communication module is used for communicating with ground basic station for ground basic station can control unmanned aerial vehicle.
The photoelectric pod is provided with a lens and an inertia measurement unit, the lens is used for shooting images, and the inertia measurement unit is used for detecting attitude information of the photoelectric pod and comprises a gyroscope, a geomagnetic meter and an accelerometer.
Further, in the present invention, the inertia measurement unit may be integrated inside the photoelectric pod or mounted outside the photoelectric pod, and is not particularly limited thereto.
The information processing module is electrically connected with the photoelectric pod and the unmanned aerial vehicle flight control unit.
Further, the photoelectric pod is mounted at the bottom of the unmanned aerial vehicle; the information processing module is used for resolving the image shot by the photoelectric pod and controlling the photoelectric pod and the unmanned aerial vehicle according to the resolving result.
In a preferred embodiment, the optoelectronic pod is a two-degree-of-freedom optoelectronic pod with a lens that can be rotated in two mutually perpendicular directions.
In the invention, the lens carried by the photoelectric pod is not particularly limited, and can be a visible light lens, an infrared lens or other lenses, and the person skilled in the art can select the lens according to actual needs.
In a preferred embodiment, the photoelectric pod is connected with the information processing module through a universal USB or a serial port, and the universal quick interface can realize quick replacement of different types of photoelectric pods when different tasks are executed.
According to the invention, the information processing module comprises an image identification tracking sub-module, a pan-tilt-pan PID controller sub-module and a parameter correction sub-module.
The image recognition tracking submodule is used for recognizing a target in an image, calculating a rotation angle required by the photoelectric pod lens for moving the target to the middle position of the image, and outputting a lens rotation angle rate parameter.
The inventor finds that a lens rotation angle rate parameter is directly used as an input of a pan-tilt PID controller, and a serious over-rotation phenomenon occurs, for example, when the pan-tilt is actually required to rotate from 20 ° to 40 °, the pan-tilt is over-rotated from 20 ° to 45 °, then revolved to 35 °, and then rotated to 40 °, even multiple over-rotation phenomena occur, which not only causes a large rotation angle, increases blurred images in a captured image, affects image recognition accuracy, but also easily causes a phenomenon of losing a target if the target moves at a too fast speed.
In the invention, a parameter correction submodule is arranged in an information processing module and is used for correcting the lens rotation angle rate parameter output by the image identification tracking submodule so as to solve or lighten the over-rotation phenomenon.
Specifically, the parameter correction submodule can acquire detection data of a gyroscope, an accelerometer and a geomagnetism meter, receive a lens rotation angular rate parameter output by the image recognition tracking submodule, correct the lens rotation angular rate parameter through the detection data, and output the corrected lens rotation angular rate parameter to the holder PID controller submodule.
And the cradle head PID controller submodule receives the corrected lens rotation angle rate parameter and uses the corrected lens rotation angle rate parameter as the input of the PID controller so as to control the rotation of a motor in the photoelectric pod, thereby realizing the steering of the lens.
Through the detection data with gyroscope, accelerometer and geomagnetism meter and the cloud platform rotation angle of receipt fuse, can make the rotatory stability of photoelectricity nacelle camera lens, the camera lens rocks and reduces by a wide margin, and it is more clear to shoot the image, and the camera lens over-rotation phenomenon obviously reduces, tracks more stably to the target.
In the present invention, the specific hardware structure of the information processing module is not particularly limited, and a high-performance arithmetic processing device, such as Intel NUC, NVIDIA TX2, hua engine ibax-R1000, or the like, is preferably used.
Preferably, unmanned aerial vehicle machine carries vision intelligence processing system still includes throws the screen ware, throw the screen ware and be connected with information processing module, transmit the image that information processing module received to ground basic station for the staff obtains unmanned aerial vehicle's state.
The screen projector is a device capable of wirelessly transmitting image information, and is provided with an independent signal processing transmitting chip, such as RX5808, a FPV-40Ch-RC and the like, a wireless screen projector matched with a NUC is preferably adopted, the image information shot by the unmanned aerial vehicle is transmitted by using the screen projector, compared with the traditional method of directly transmitting the image information by using a communication module, the calculation amount of the information processing module is reduced, the information processing module can more efficiently identify and process the image, meanwhile, the image transmission does not occupy the bandwidth of the communication module, the communication interference is reduced, and the reliability of the unmanned aerial vehicle is improved.
On the other hand, the invention also provides an airborne visual intelligent processing method of the unmanned aerial vehicle, which comprises the following steps:
s1, collecting an image;
s2, identifying the target in the image and outputting a lens rotation angle rate parameter;
and S3, correcting the rotation angle rate parameter of the lens, and controlling the rotation of the photoelectric pod lens through a PID controller.
In step S1, the capturing the image is to capture the image through the optoelectronic pod and transfer the captured image to the image recognition tracking sub-module.
In a preferred embodiment, before the image is captured, a step S0 of selecting photoelectric pods of different lens types according to the task requirement may be further included.
Specifically, when the task environment light is good, the photoelectric pod of the visible light lens is selected; when the task environment light is poor, the photoelectric pod using the infrared lens is used, preferably, the environment light is better that the illumination intensity is greater than two million luxes, and a high-definition image can be obtained by adopting the visible light lens, so that the recognition degree is improved; when the illumination intensity is lower than or equal to two million luxes, an infrared lens is adopted to obtain a relatively clear image with low noise, and the recognition error rate is reduced.
In step S2, the image is identified by the image identification and tracking submodule, a neural network is provided in the image identification and tracking submodule, the target is identified by the neural network, and the rotation angular rate parameter of the photoelectric pod lens when the target is moved to the middle position of the image is calculated.
In the present invention, the specific method for identifying the target in the image and outputting the lens rotation angular rate parameter is not particularly limited, and may be any known method, for example, the methods in patent nos. CN201520131508.5 and CN201710014199.7, and no further description is given here.
In step S3, the correction of the lens rotation angular rate parameter is performed by fusing the information detected by the inertial measurement unit with the lens rotation angular rate parameter.
Further, the information detected by the inertial measurement unit includes gyroscope detection information, geomagnetism meter detection information, and speedometer detection information.
In the invention, the rotation angle rate parameter of the lens of the photoelectric pod is described by adopting an attitude quaternion, the quaternion method is proposed by England mathematician W.R. Haminlton in 1843, and the rotation angle rate parameter of the lens of the photoelectric pod is expressed by adopting a gyroscope coordinate system S, and further, the rotation angle rate parameter of the lens of the photoelectric pod is provided by England mathematician W.R. Haminlton and is a common attitude calculation mode.
Specifically, the method comprises S31, fusing gyroscope detection information, and obtaining a gyroscope fusion attitude quaternion;
s32, fusing geomagnetic meter detection information and speedometer detection information to obtain a geomagnetic meter and accelerometer fusion attitude quaternion;
and S33, linearly fusing the gyroscope fusion attitude quaternion and the geomagnetism and accelerometer fusion attitude quaternion to finish the correction, and transmitting the corrected attitude quaternion to the PID controller to further realize the steering of the lens.
In step S31, at time t, the pose quaternion characterizing the lens rotation angular rate parameter passed by the image recognition and tracking sub-module is:
Figure BDA0002728668780000091
after the attitude quaternion representing the lens rotation angle rate parameter transmitted by the image recognition and tracking submodule is filtered by the Kalman filter, the attitude quaternion estimated value representing the lens rotation angle rate at the last moment (t-1 moment) can be obtained
Figure BDA0002728668780000092
In the invention, the gyroscope fusion attitude quaternion is obtained by fusing the gyroscope detection information with the gyroscope detection information, and can be expressed as:
Figure BDA0002728668780000093
wherein the content of the first and second substances,
Figure BDA0002728668780000094
and delta t is the detection period of the gyroscope.
Further, differentiation of attitude quaternion
Figure BDA0002728668780000095
Can be obtained by the following formula
Figure BDA0002728668780000101
Wherein the content of the first and second substances,Sωtangular velocity components of the respective axes in the information are detected for the gyroscope.
In step S32, kalman filtering processing needs to be performed on the accelerometer detection information, where the accelerometer detection information after the kalman filtering processing is represented as:
Figure BDA0002728668780000102
gradient of acceleration function
Figure BDA0002728668780000103
Expressed as:
Figure BDA0002728668780000104
wherein the content of the first and second substances,
Figure BDA0002728668780000105
a jacobian matrix representing the derivative of the error function of the accelerometer characterization,
Figure BDA0002728668780000106
Figure BDA0002728668780000107
the formula five and the formula six are substituted into the formula four, so that the gradient of the acceleration function can be obtained
Figure BDA0002728668780000108
The detection information of the geomagnetism is
Figure BDA0002728668780000109
In combination with the local earth magnetic field vector
Figure BDA00027286687800001010
The gradient of the earth magnetic field vector error function can be obtained
Figure BDA00027286687800001011
Figure BDA00027286687800001012
The gradient of the acceleration function
Figure BDA00027286687800001013
Gradient of vector error function of earth magnetic field
Figure BDA00027286687800001014
The total error function gradient can be obtained simultaneously
Figure BDA00027286687800001015
Figure BDA0002728668780000111
Iteratively graduating the total error function using a gradient descent method
Figure BDA0002728668780000112
And
Figure BDA0002728668780000113
and fusing to obtain a lens rotation angle rate parameter attitude quaternion fused with the geomagnetism and the accelerometer, namely a geomagnetism and accelerometer fused attitude quaternion:
Figure BDA0002728668780000114
Figure BDA0002728668780000115
wherein, α is an accelerometer noise amplification factor, which is related to the accelerometer hardware used and can be obtained from accelerometer factory specifications or obtained by performing related hardware detection on the accelerometer.
In step S33, the gyroscope-fused-attitude quaternion and the geomagnetism-accelerometer-fused-attitude quaternion are linearly fused, specifically, the linear fusion is performed in the following manner to obtain a modified attitude quaternion
Figure BDA0002728668780000116
Figure BDA0002728668780000117
Where γ is a dynamic weight that changes with the motion state of the optoelectronic pod, and can be expressed as:
Figure BDA0002728668780000118
Figure BDA0002728668780000119
the quaternion convergence rate of the accelerometer and the quaternion divergence rate of the gyroscope measurement error are beta, and the beta is related to the used gyroscope hardware and can be obtained from the gyroscope factory specifications or obtained by performing related hardware detection on the gyroscope.
According to the invention, the corrected attitude quaternion is applied
Figure BDA00027286687800001110
And the input is transmitted to the PID controller and used as the input of the PID controller, and the PID controller controls the rotation of the photoelectric pod motor according to the input, so that the steering of the lens is realized.
In a preferred embodiment, since the accelerometer quaternion convergence rate is much greater than the gyroscope divergence rate, there is:
Figure BDA0002728668780000121
substituting the formula into ten and eleven to obtain the corrected attitude quaternion
Figure BDA0002728668780000122
Can also be expressed as:
Figure BDA0002728668780000123
the formula twelve simplifies the calculation amount of the correction process, and improves the calculation speed and the response sensitivity under the condition that the steering control effect of the lens is the same.
Examples
Example 1
Carrying out a simulation experiment, and tracking a moving target by using an unmanned aerial vehicle carrying a photoelectric pod, an information processing module and a screen projector, wherein the moving target is a small multi-rotor aircraft which rapidly flies around a splayed shape; the photoelectric pod carried on the unmanned aerial vehicle is a two-degree-of-freedom photoelectric pod, the lens adopts a visible light lens, and the information processing module comprises an image recognition tracking sub-module, a holder PID controller sub-module and a parameter correction sub-module.
In step S1, the drone takes an image through the photoelectric pod;
in step S2, the image is identified by the image identification and tracking submodule, and the rotation angular rate parameter of the photoelectric pod lens is calculated when the target is moved to the middle position of the image;
in step S3, the information detected by the inertial measurement unit is fused with the lens rotation angular rate parameter by the parameter correction submodule to realize lens rotation angular rate parameter correction, and the PID controller controls the rotation of the photoelectric pod lens with the corrected lens rotation angular rate parameter as an input. In step S31, a gyroscope fusion attitude quaternion is obtained by fusing gyroscope detection information of formula two:
Figure BDA0002728668780000131
wherein the content of the first and second substances,
Figure BDA0002728668780000132
and delta t is the detection period of the gyroscope.
Differentiation of attitude quaternion
Figure BDA0002728668780000133
Obtained by the following formula
Figure BDA0002728668780000134
Wherein the content of the first and second substances,Sωtdetecting angular velocity components of the respective axes in the information for the gyroscope;
in step S32, a geomagnetism and accelerometer fused attitude quaternion is obtained by fusing the geomagnetism detection information and the speedometer detection information by formula nine:
Figure BDA0002728668780000135
Figure BDA0002728668780000136
wherein the noise amplification coefficient of the accelerometer is alpha 0.000308, and the total error function gradient
Figure BDA0002728668780000137
Obtained by the formula eight:
Figure BDA0002728668780000138
gradient of acceleration function
Figure BDA0002728668780000139
Gradient of vector error function of geomagnetic field
Figure BDA00027286687800001310
Respectively expressed as:
Figure BDA00027286687800001311
Figure BDA00027286687800001312
wherein
Figure BDA0002728668780000141
Figure BDA0002728668780000142
Figure BDA0002728668780000143
The accelerometer detection information is obtained after Kalman filtering processing.
In step S33, the gyroscope-fused-attitude quaternion and the geomagnetism-accelerometer-fused-attitude quaternion are linearly fused by thirteen equations to obtain a corrected attitude quaternion
Figure BDA0002728668780000144
Figure BDA0002728668780000145
Wherein the quaternion divergence rate beta of the gyroscope measurement error is 0.0029.
Quaternion of the corrected attitude
Figure BDA0002728668780000146
The direction of the lens is transmitted to a PID controller, and then the steering of the lens is realized, wherein the parameters of the PID controller are as follows: kp 60, Ki 0.2, and Kd 8.
Comparative example 1
The simulation experiment in example 1 was repeated except that the lens rotation angular rate parameter was corrected without providing a parameter correction submodule, and the PID controller controlled the rotation of the electro-optic pod lens with the lens rotation angular rate parameter as an input amount without correction.
Experimental example 1
In example 1 and comparative example 1, the target recognition success rate within 10 minutes after the unmanned aerial vehicle finds the target is recorded, and the target recognition rate is the ratio of the number of images with the target recognized by the image recognition tracking sub-module to the total number of images transmitted by the optoelectronic pod.
The results are shown in table one:
watch 1
Figure BDA0002728668780000151
As can be seen from table one, the ratio of successfully identifying the target in the embodiment 1 is obviously higher than that in the comparative example 1, which shows that by correcting the lens rotation angle rate parameter, the rotation stability of the lens of the optoelectronic pod is improved, the lens shake is greatly reduced, the shot image is clearer, and the image identification and tracking sub-module is more convenient to identify the target therein.
In the description of the present invention, it should be noted that the terms "upper", "lower", "inner", "outer", "front", "rear", and the like indicate orientations or positional relationships based on operational states of the present invention, and are only used for convenience of description and simplification of description, but do not indicate or imply that the referred device or element must have a specific orientation, be constructed in a specific orientation, and be operated, and thus should not be construed as limiting the present invention. Furthermore, the terms "first," "second," "third," and "fourth" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
In the description of the present invention, it should be noted that, unless otherwise specifically stated or limited, the terms "mounted," "connected," and "connected" are to be construed broadly, e.g., as meaning either a fixed connection, a removable connection, or an integral connection; can be mechanically or electrically connected; the connection may be direct or indirect via an intermediate medium, and may be a communication between the two elements. The specific meanings of the above terms in the present invention can be understood in specific cases to those skilled in the art.
The present invention has been described above in connection with preferred embodiments, but these embodiments are merely exemplary and merely illustrative. On the basis of the above, the invention can be subjected to various substitutions and modifications, and the substitutions and the modifications are all within the protection scope of the invention.

Claims (4)

1. An unmanned aerial vehicle airborne vision intelligent processing method comprises the following steps:
s1, collecting an image;
s2, identifying the target in the image and outputting a lens rotation angle rate parameter;
s3, correcting the rotation angle rate parameter of the lens, and controlling the rotation of the photoelectric pod lens through a PID controller;
the unmanned aerial vehicle is provided with a photoelectric pod, the photoelectric pod is provided with an inertia measurement unit and is used for detecting attitude information of the photoelectric pod, and the photoelectric pod comprises a gyroscope, a geomagnetic meter and an accelerometer;
in step S3, the correcting the lens rotation angular rate parameter is performed by fusing information detected by an inertial measurement unit including gyroscope detection information, geomagnetism detection information, and speedometer detection information with the lens rotation angular rate parameter;
step S3 includes the following substeps:
s31, fusing gyroscope detection information to obtain a gyroscope fusion attitude quaternion;
s32, fusing geomagnetic meter detection information and speedometer detection information to obtain a geomagnetic meter and accelerometer fusion attitude quaternion;
and S33, linearly fusing the gyroscope fusion attitude quaternion and the geomagnetism and accelerometer fusion attitude quaternion to finish the correction, and transmitting the corrected attitude quaternion to the PID controller to further realize the steering of the lens.
2. The intelligent processing method for onboard vision of unmanned aerial vehicle according to claim 1,
in step S1, the capturing the image is to capture the image through the optoelectronic pod and transfer the captured image to the image recognition tracking sub-module.
3. The intelligent processing method for onboard vision of unmanned aerial vehicle according to claim 1,
in step S2, the image is identified by the image identification and tracking submodule, which is provided with a neural network, and the target is identified by the neural network, and the rotation angular rate parameter of the electro-optic pod lens when the target is moved to the middle position of the image is calculated.
4. The intelligent processing method for onboard vision of unmanned aerial vehicle according to claim 1,
in step S31, the fused gyroscope attitude quaternion after the fusion of the gyroscopes may be expressed as:
Figure FDA0003404362640000021
wherein the content of the first and second substances,
Figure FDA0003404362640000022
an attitude quaternion estimation value representing the rotation angle rate of the lens at the time of t-1 is obtained by filtering the attitude quaternion representing the rotation angle rate parameter of the lens through a Kalman filter,
Figure FDA0003404362640000023
is the differential of the attitude quaternion, delta t is the detection period of the gyroscope, the differential of the attitude quaternion
Figure FDA0003404362640000024
Can be obtained by the following formula:
Figure FDA0003404362640000025
Sωtangular velocity components of the respective axes in the information are detected for the gyroscope.
CN202011111242.XA 2020-10-16 2020-10-16 Unmanned aerial vehicle airborne vision intelligent processing system and method Active CN113141459B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011111242.XA CN113141459B (en) 2020-10-16 2020-10-16 Unmanned aerial vehicle airborne vision intelligent processing system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011111242.XA CN113141459B (en) 2020-10-16 2020-10-16 Unmanned aerial vehicle airborne vision intelligent processing system and method

Publications (2)

Publication Number Publication Date
CN113141459A CN113141459A (en) 2021-07-20
CN113141459B true CN113141459B (en) 2022-04-05

Family

ID=76809740

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011111242.XA Active CN113141459B (en) 2020-10-16 2020-10-16 Unmanned aerial vehicle airborne vision intelligent processing system and method

Country Status (1)

Country Link
CN (1) CN113141459B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114593710B (en) * 2022-03-04 2024-02-06 四川傲势科技有限公司 Unmanned aerial vehicle measurement method, unmanned aerial vehicle measurement system, electronic equipment and medium
PL442945A1 (en) * 2022-11-24 2024-05-27 Enprom Spółka Z Ograniczoną Odpowiedzialnością Method of point monitoring and system for point monitoring of objects, especially power poles

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102355574A (en) * 2011-10-17 2012-02-15 上海大学 Image stabilizing method of airborne tripod head moving target autonomous tracking system
JP2013198118A (en) * 2012-03-22 2013-09-30 Toshiba Corp Tracking device
CN106053874A (en) * 2015-04-01 2016-10-26 鹦鹉无人机股份有限公司 Drone provided with a vertical-view video camera compensated for the instantaneous rotations for estimation of the horizontal speeds
CN110824453A (en) * 2020-01-10 2020-02-21 四川傲势科技有限公司 Unmanned aerial vehicle target motion estimation method based on image tracking and laser ranging

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102355574A (en) * 2011-10-17 2012-02-15 上海大学 Image stabilizing method of airborne tripod head moving target autonomous tracking system
JP2013198118A (en) * 2012-03-22 2013-09-30 Toshiba Corp Tracking device
CN106053874A (en) * 2015-04-01 2016-10-26 鹦鹉无人机股份有限公司 Drone provided with a vertical-view video camera compensated for the instantaneous rotations for estimation of the horizontal speeds
CN110824453A (en) * 2020-01-10 2020-02-21 四川傲势科技有限公司 Unmanned aerial vehicle target motion estimation method based on image tracking and laser ranging

Also Published As

Publication number Publication date
CN113141459A (en) 2021-07-20

Similar Documents

Publication Publication Date Title
CN108351574B (en) System, method and apparatus for setting camera parameters
CN108883825B (en) System and method for unmanned aerial vehicle transport and data acquisition
CN108769531B (en) Method for controlling shooting angle of shooting device, control device and remote controller
CN106525074B (en) A kind of compensation method, device, holder and the unmanned plane of holder drift
WO2017096548A1 (en) Systems and methods for auto-return
WO2018178756A1 (en) System and method for providing autonomous photography and videography
CN113141459B (en) Unmanned aerial vehicle airborne vision intelligent processing system and method
CN107000840A (en) Unmanned vehicle and many mesh imaging systems
CN109254587B (en) Small unmanned aerial vehicle capable of stably hovering under wireless charging condition and control method thereof
US20180022472A1 (en) Autonomous system for taking moving images from a drone, with target tracking and improved target location
CN203845021U (en) Panoramic aerial photographic unit system for aircrafts
WO2021217371A1 (en) Control method and apparatus for movable platform
WO2021043214A1 (en) Calibration method and device, and unmanned aerial vehicle
US20200145568A1 (en) Electro-optical imager field of regard coverage using vehicle motion
WO2020048365A1 (en) Flight control method and device for aircraft, and terminal device and flight control system
CN113795805A (en) Flight control method of unmanned aerial vehicle and unmanned aerial vehicle
CN114604439B (en) Aerial photography video image stabilization system for flapping wing flying robot
WO2020038720A1 (en) Apparatus, method and computer program for detecting the form of a deformable object
CN112950671A (en) Real-time high-precision parameter measurement method for moving target by unmanned aerial vehicle
CN109143303A (en) Flight localization method, device and fixed-wing unmanned plane
WO2018024239A1 (en) Hybrid image stabilization system
CN110337668A (en) Image stability augmentation method and apparatus
CN113063401A (en) Unmanned aerial vehicle aerial survey system
WO2020168519A1 (en) Camera parameter adjusting method, camera device, and movable platform
CN110209199A (en) A kind of farmland fire source monitoring UAV system design

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant