CN110081982B - Unmanned aerial vehicle target positioning method based on double-spectrum photoelectric search


Info

Publication number
CN110081982B
CN110081982B (application CN201910179876.XA)
Authority
CN
China
Prior art keywords
target
unmanned aerial
aerial vehicle
infrared
visible light
Prior art date
Legal status
Active
Application number
CN201910179876.XA
Other languages
Chinese (zh)
Other versions
CN110081982A (en)
Inventor
蔡宇
Current Assignee
China Forestry Star Beijing Technology Information Co ltd
Original Assignee
China Forestry Star Beijing Technology Information Co ltd
Priority date
Filing date
Publication date
Application filed by China Forestry Star Beijing Technology Information Co ltd filed Critical China Forestry Star Beijing Technology Information Co ltd
Priority to CN201910179876.XA priority Critical patent/CN110081982B/en
Publication of CN110081982A publication Critical patent/CN110081982A/en
Application granted granted Critical
Publication of CN110081982B publication Critical patent/CN110081982B/en

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 - Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/02 - Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
    • G01C11/04 - Interpretation of pictures
    • G01J - MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00 - Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J5/0022 - Radiation pyrometry for sensing the radiation of moving bodies
    • G01J2005/0077 - Imaging

Abstract

The invention discloses an unmanned aerial vehicle target positioning method based on double-spectrum photoelectric search. An infrared camera and a visible light camera are jointly calibrated to obtain the external parameter matrix between their coordinate systems. The set range is then searched in a raster (flyback) pattern. If a target is detected in the infrared image and the visible light image simultaneously, it is determined to be a true target and the method proceeds to the next step; otherwise the target is deemed not found and the search resumes at the previous step. The pan-tilt head is rotated so that the target sits at the center of the infrared image, and the infrared and visible light images at that moment are recorded. The accurate spatial position of the target is then calculated from the external parameter matrix and related information. Sampling at a fixed time interval and repeating steps four to six yields the spatial position at each sampling instant, from which the motion direction and speed of the target are estimated. The invention overcomes the shortcomings of radar and radio detection, and avoids the problem in radar-plus-thermal-imaging systems that thermal infrared images blur during search-angle changes and the results are hard to use.

Description

Unmanned aerial vehicle target positioning method based on double-spectrum photoelectric search
Technical Field
The invention relates to an unmanned aerial vehicle target positioning method, in particular to an unmanned aerial vehicle target positioning method utilizing infrared and visible light double-spectrum photoelectric search.
Background
Nowadays, unmanned aerial vehicles ("low, small and slow" targets) are widely used in many fields, and the safety problems they raise are attracting growing attention. At present there are two main means of detecting low, small and slow targets: radar detection systems and radio detection systems.
The first is detection of low, small and slow targets with a radar system. Radar detection and identification suffers from blind zones: radar cannot identify unmanned aerial vehicles near high-rise buildings, is difficult to apply in most environments, and is hard to shield from various external interference. Radar detection must also actively radiate radio-frequency power, causing microwave pollution, and targets close to ground buildings are difficult to detect.
The second is detection of low, small and slow targets by radio detection. This method has a high false-alarm probability, and the false alarms are difficult to distinguish.
Neither method can directly identify the target type; image processing is required for discrimination. Detection based on infrared photoelectricity works around the clock and adapts well to night and to severe weather such as cloud, fog and haze. Compared with radar and laser imaging, thermal infrared imaging is cheaper and more reliable for monitoring, and photoelectric detection emits no signal, giving strong concealment. Infrared thermal imaging also shows the target's shape clearly, so camouflage, decoys and similar interference can be exposed by image recognition, and the system adapts to complex environments.
Existing unmanned aerial vehicle monitoring systems use radar or radio detection as the main detection means, with a photoelectric detection system as an auxiliary discriminator. The radar is mainly pulse-Doppler radar, which can only detect and locate moving targets and therefore fails on multi-rotor unmanned aerial vehicles capable of hovering. Radio detection and conventional stand-alone infrared detection systems have no independent accurate positioning capability.
Disclosure of Invention
The invention aims to provide a method for detection, accurate positioning and motion-state analysis of unmanned aerial vehicles (low, small and slow targets) based on infrared and visible light double-spectrum photoelectric search, overcoming the shortcomings of traditional radar, radio, or radar-plus-thermal-infrared anti-UAV systems.
In order to achieve the technical purpose, the technical scheme adopted by the invention is as follows:
an unmanned aerial vehicle target positioning method based on double-spectrum photoelectric search comprises the following steps:
firstly, calibrating a camera, namely performing combined calibration by using an infrared-visible light camera to obtain an external parameter matrix between an infrared camera coordinate system and a visible light camera coordinate system, and recording the external parameter matrix as [ R t ];
setting a photoelectric search range, wherein the search range comprises an azimuth angle and a pitch angle, the azimuth angle being the rotation amplitude in the horizontal direction and the pitch angle the rotation amplitude in the vertical direction, and the range spanned by the azimuth and pitch angles being larger than the field angle of the infrared thermal imaging system;
thirdly, photoelectric searching, namely, according to the searching range set in the second step, searching is started in a flyback mode according to the azimuth angle and the pitch angle;
fourthly, rapid infrared and visible light unmanned aerial vehicle target detection is carried out, if the unmanned aerial vehicle target is detected in the infrared image and the visible light image simultaneously, the unmanned aerial vehicle target is determined to be a true target, and the step five is carried out; otherwise, determining that the target is not searched, and switching the holder from one search angle to the next search angle for searching again, namely returning to the step three;
fifthly, after the infrared + visible light images detect the unmanned aerial vehicle target, rotating the holder to enable the unmanned aerial vehicle target to be located at the center of the infrared image, and recording the infrared image and the visible light image at the moment;
positioning the unmanned aerial vehicle, and calculating the accurate spatial position of the target of the unmanned aerial vehicle according to an external parameter matrix, camera internal parameters, image resolution and field angle information among the infrared-visible cameras;
analyzing the motion state, namely sampling at a fixed time interval, repeating steps four to six to obtain the spatial position at each sampling instant, and then estimating the motion direction and speed of the unmanned aerial vehicle.
As an optimization of the unmanned aerial vehicle target positioning method based on double-spectrum photoelectric search, in step three, when the azimuth angle reaches its maximum during the photoelectric search, the pitch angle is increased, so that the photoelectric search-tracking system then searches along the path from the maximum azimuth angle back to the minimum azimuth angle.
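The flyback (boustrophedon) search path described above can be sketched as a plain angle-sequence generator; the function name and the shared step size for azimuth and pitch are illustrative assumptions, not details from the patent:

```python
def raster_scan(az_min, az_max, pitch_min, pitch_max, step):
    """Return the (azimuth, pitch) stop angles of a flyback (boustrophedon)
    search: sweep azimuth one way, raise pitch by one step, sweep back."""
    stops = []
    pitch = pitch_min
    forward = True
    while pitch <= pitch_max:
        sweep = []
        a = az_min
        while a <= az_max:
            sweep.append(a)
            a += step
        if not forward:
            sweep.reverse()           # return sweep: max azimuth to min azimuth
        for az in sweep:
            stops.append((az, pitch))
        pitch += step                 # raise pitch once the azimuth limit is hit
        forward = not forward
    return stops
```

For example, `raster_scan(0, 2, 0, 1, 1)` visits the azimuth stops left to right on the first pitch row and right to left on the next, so the pan-tilt head never jumps back across the whole range.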
As another preferable selection of the unmanned aerial vehicle target positioning method based on the double-spectrum photoelectric search, the specific method for performing the rapid infrared + visible light unmanned aerial vehicle target detection in the fourth step comprises the following steps:
Step 1, label a large number of infrared images that contain and that do not contain unmanned aerial vehicles, and train a deep-learning-based infrared fully convolutional network (FCN) model; likewise label a large number of visible light images that contain and that do not contain unmanned aerial vehicles, and train a visible light fully convolutional network (FCN) model;
step2, acquiring an infrared image and a visible light image of the current search angle position;
step3, detecting the obtained infrared image by using the trained infrared full convolution network FCN model, if the unmanned aerial vehicle target is detected, determining the target as a suspected target, otherwise, not detecting the suspected target;
Step 4, rotate the pan-tilt head to translate the suspected target to the center of the infrared image; according to the mapping relationship between the infrared image and the visible light image, run the visible light fully convolutional network (FCN) model on the corresponding region of the visible light image; if the unmanned aerial vehicle target is detected there, the suspected target is determined to be a reasonable target, otherwise it is a false target;
Step 5, if the reasonable target can be detected in n consecutive frames it is a true target, otherwise it is a false target, where n is the number of image frames shot by the photoelectric search-tracking system at each stop, n > 1.
Step 6, if it is a true target, enter the timed sampling mode and further analyze the motion state of the target; if no target is detected, the pan-tilt head rotates to the next search angle and the next detection process begins.
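The Step 1 to Step 6 decision flow reduces to a small confirmation loop: a frame counts only when the infrared and visible light detectors agree, and n consecutive agreeing frames are required before the target is declared true. A minimal sketch, with the two trained FCN detectors abstracted as callables (the function name is an assumption for illustration):

```python
def confirm_target(frames, ir_detect, vis_detect, n):
    """Declare a true target only after n consecutive frames in which BOTH
    the infrared and the visible-light detectors report the drone."""
    streak = 0
    for ir_img, vis_img in frames:
        if ir_detect(ir_img) and vis_detect(vis_img):  # dual-spectrum agreement
            streak += 1
            if streak >= n:
                return True   # true target: switch to timed-sampling mode
        else:
            streak = 0        # a miss breaks the consecutive-frame streak
    return False              # false target: move to the next search angle
```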
As another optimization of the unmanned aerial vehicle target positioning method based on double-spectrum photoelectric search, n is in a value range of 3-5.
As another optimization of the unmanned aerial vehicle target positioning method based on the double-spectrum photoelectric search, disclosed by the invention, an infrared-visible light camera system is calibrated, and internal parameters of an infrared camera and internal parameters of a visible light camera are also obtained.
As another preferable form of the unmanned aerial vehicle target positioning method based on double-spectrum photoelectric search in the present invention, in the first step, the external parameter matrix between the infrared camera coordinate system and the visible light camera coordinate system is:
[R t] = | r11  r12  r13  t1 |
        | r21  r22  r23  t2 |    (formula 1)
        | r31  r32  r33  t3 |
as still another preferable selection of the unmanned aerial vehicle target positioning method based on the double-spectrum photoelectric search, the specific method for positioning the unmanned aerial vehicle in the sixth step comprises the following steps:
(1) assuming that an infrared camera coordinate system with an infrared camera optical center as an origin O and an optical axis as a Z axis is O-XYZ, a visible light camera coordinate system with a visible light camera optical center as an origin O 'and an optical axis as a Z' axis is O '-X' Y 'Z', and an infrared camera coordinate system is a reference coordinate system, a linear equation of the infrared camera optical axis is
L_ir: x = 0, y = 0    (formula 2)
According to the camera calibration, the conversion matrix for converting the infrared camera coordinate system into the visible light camera coordinate system is [ R t ], the conversion matrix for converting the infrared camera optical axis into the visible light camera optical axis is also [ R t ], so that the linear equation of the visible light camera optical axis is
L_vl: (x − t1)/r13 = (y − t2)/r23 = (z − t3)/r33    (formula 3)
(2) When the unmanned aerial vehicle is searched, the holder is rotated to enable the unmanned aerial vehicle to be positioned at the positive center P of the infrared image, and at the moment, a point with the minimum sum of the distances between the point and an optical axis line OP of the infrared camera and the distance between the point and the straight line O' P is obtained and used as the space position of the target of the unmanned aerial vehicle;
(3) calculating the relative position of the target according to the coordinate of the unmanned aerial vehicle target in the infrared camera reference coordinate system, converting the coordinate into GPS and elevation information of the target by combining the GPS coordinate and elevation information of the infrared camera, and transmitting the GPS and elevation information to a Geographic Information System (GIS);
(4) acquire images at fixed intervals and repeat steps (1) to (3) to obtain the position of the unmanned aerial vehicle target at each sampling time point, so as to estimate its direction and speed of motion.
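Step (3) converts the target's camera-frame position into GPS and elevation for the Geographic Information System. The patent does not give the conversion; a minimal flat-earth (local tangent plane) sketch, assuming the relative position has already been expressed in east-north-up metres and using the standard WGS-84 equatorial radius (a standard constant, not from the patent):

```python
import math

EARTH_RADIUS_M = 6_378_137.0  # WGS-84 equatorial radius in metres

def enu_to_gps(cam_lat, cam_lon, cam_alt, east, north, up):
    """Flat-earth conversion of an east-north-up offset (metres) relative to
    the camera's GPS position into latitude, longitude and elevation."""
    dlat = math.degrees(north / EARTH_RADIUS_M)
    dlon = math.degrees(east / (EARTH_RADIUS_M * math.cos(math.radians(cam_lat))))
    return cam_lat + dlat, cam_lon + dlon, cam_alt + up
```

This small-offset approximation is adequate at the short ranges involved; a production system would use a proper geodetic library.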
As still another preferable selection of the unmanned aerial vehicle target positioning method based on the double-spectrum photoelectric search of the present invention, in the step (2), the method of finding the point having the minimum sum of the distances from the infrared camera optical axis line OP and the line O' P is specifically as follows:
when the pan-tilt head is rotated so that the unmanned aerial vehicle sits at the center of the infrared image, the corresponding infrared and visible light images are recorded. From the visible light image, the unmanned aerial vehicle target deviates from the image center by n_x pixels in the column direction and n_y pixels in the row direction, so the corresponding actual distances are Δx and Δy respectively. Assuming the width of the visible light camera CCD sensor is w and its resolution is h pixels, then
Δx = n_x · w / h,  Δy = n_y · w / h
The unmanned aerial vehicle target is at the center P of the infrared image, so the infrared camera optical axis line L_ir coincides with OP; in the visible light image, according to the camera imaging principle:
tan α = Δx / f
tan β = Δy / f    (formula 4)
where f is the visible light camera focal length.
where α and β are the angles of rotation of O'P about the Y axis and the X axis respectively, centered on O'; according to the visible light camera imaging relationship, rotating through the angles α and β in three-dimensional space yields the visible light optical axis L_vl.
Assume a point on the line O'P has coordinates (x, y, z) in the infrared camera reference coordinate system. Rotating it onto the Z axis of the visible light camera reference frame gives (x1, y1, z1), and applying the transformation matrix [R t] then yields a point (0, 0, z0) on the Z axis of the infrared camera coordinate system. Then:
rotate by an angle beta around the X axis to obtain
| x2 |   | 1     0      0     | | x1 |
| y2 | = | 0   cos β  −sin β  | | y1 |
| z2 |   | 0   sin β   cos β  | | z1 |
Rotate by an angle alpha around the Y axis to obtain
| 0  |   |  cos α  0  sin α | | x2 |
| 0  | = |  0      1    0   | | y2 |
| z0 |   | −sin α  0  cos α | | z2 |
Namely:
x1 cos α + (y1 sin β + z1 cos β) sin α = 0
y1 cos β − z1 sin β = 0
z0 = −x1 sin α + (y1 sin β + z1 cos β) cos α    (formula 5)
meanwhile, according to equation 4, there are:
tan α = Δx / f,  tan β = Δy / f
namely:
x1 = −z1 · tan α / cos β,  y1 = z1 · tan β    (formula 6)
the general equation for obtaining the line O' P is:
A1·x + B1·y + C1·z + D1 = 0,  A2·x + B2·y + C2·z + D2 = 0    (formula 7)
(general two-plane form of the line O'P; the coefficients follow from formulas 5 and 6 and the matrix [R t])
Substituting formulas 5 and 6, the line is converted to point-direction form:
(x − x0)/M = (y − y0)/N = (z − z0)/L
where (x0, y0, z0) is a point on the line O'P and (M, N, L) is its direction vector.
and (3) solving a point with the minimum sum of the distances between the point and the infrared camera optical axis line OP and the distance between the point and the line O' P as the position of the unmanned aerial vehicle target:
assuming that the coordinates of the unmanned aerial vehicle target are (x, y, z), the distance between the unmanned aerial vehicle target and the optical axis line OP of the infrared camera is
√(x² + y²)
The distance to the straight line O' P is obtained as follows:
Let the foot of the perpendicular from the unmanned aerial vehicle target (x, y, z) to the line O'P have coordinates (xc, yc, zc). Since this foot lies on the line,
(xc − x0)/M = (yc − y0)/N = (zc − z0)/L = C
The deformation is as follows:
xc = M·C + x0,  yc = N·C + y0,  zc = L·C + z0    (formula 8)
and the perpendicular direction vector (x − xc, y − yc, z − zc) is orthogonal to the line direction vector (M, N, L), i.e. their dot product is zero:
M·(x − xc) + N·(y − yc) + L·(z − zc) = 0    (formula 9)
Substituting formula 8 into formula 9 and eliminating the unknowns xc, yc, zc gives an expression for the parameter C:
C = [M·(x − x0) + N·(y − y0) + L·(z − z0)] / (M² + N² + L²)    (formula 10)
The distance from the point to the line O'P is then its distance to the foot of the perpendicular (xc, yc, zc):
√((x − xc)² + (y − yc)² + (z − zc)²)
where xc, yc and zc are eliminated by substituting formula 8 and formula 10.
Therefore, the point with the minimum sum of distances to the infrared camera optical axis line OP and to the line O'P is found by minimizing
d = √(x² + y²) + √((x − xc)² + (y − yc)² + (z − zc)²)
and the value of (x, y, z) at the minimum is the spatial position of the unmanned aerial vehicle target.
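The minimization above has a convenient geometric reading: the smallest possible sum of distances to two skew lines equals the distance between the lines, and it is attained on their common perpendicular segment. A pure-Python sketch that returns the midpoint of that segment as the target position (the midpoint choice is an assumption; the patent only requires a minimizing point):

```python
def closest_point_between_lines(p1, d1, p2, d2):
    """Midpoint of the shortest segment between the skew lines p1 + t*d1 and
    p2 + s*d2 (pure vector algebra, no external dependencies)."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    sub = lambda a, b: tuple(x - y for x, y in zip(a, b))
    w = sub(p1, p2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w), dot(d2, w)
    denom = a * c - b * b                 # zero only if the lines are parallel
    t = (b * e - c * d) / denom           # parameter of the foot on line 1
    s = (a * e - b * d) / denom           # parameter of the foot on line 2
    q1 = tuple(p + t * v for p, v in zip(p1, d1))
    q2 = tuple(p + s * v for p, v in zip(p2, d2))
    return tuple((u + v) / 2 for u, v in zip(q1, q2))
```

Here OP would be passed as the reference-frame Z axis, p1 = (0, 0, 0) with d1 = (0, 0, 1), and O'P as any point and direction obtained from its point-direction form.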
Has the advantages that:
1. The photoelectric search detection mode does not require the target motion characteristics needed by traditional photoelectric or radar detection. The target detection result is fed back to the system in time so that the turntable either continues searching or switches to motion-analysis mode; the rapid start-stop of the pan-tilt head is thus combined with target image detection, and the target type can be identified directly by image processing. This overcomes the shortcomings of radar and radio detection, and avoids the problem in radar-plus-thermal-imaging systems that thermal infrared images blur during search-angle changes and the results are hard to use.
2. In the target image detection process, the method uses a deep-learning-based fully convolutional network (FCN) model to discriminate true from false targets, with high reliability.
3. Compared with traditional approaches, the method achieves fast, accurate detection and positioning of distant small infrared targets, has strong noise resistance and high environmental tolerance, works normally in harsh conditions, and offers flexible deployment, convenient installation, and strong anti-interference concealment.
4. The method further judges the motion state of the unmanned aerial vehicle target, which is of reference value for guiding counter-UAV tasks.
Drawings
The invention is further illustrated by the non-limiting examples given in the accompanying drawings;
FIG. 1 is a flow chart of the system operation of the present invention;
FIG. 2 is a flow chart of a target detection algorithm of the present invention;
FIG. 3 is a diagram of the positional relationship of the unmanned aerial vehicle target, the infrared camera and the visible light camera;
FIG. 4 is a schematic view of an infrared and visible light image of a target of an unmanned aerial vehicle;
FIG. 5 is a schematic diagram of two-dimensional imaging of a camera;
FIG. 6 is a schematic diagram of visible light camera three-dimensional imaging;
Detailed Description
In order that those skilled in the art can better understand the present invention, the following technical solutions are further described with reference to the accompanying drawings and examples.
The embodiment of the invention provides a working method of an anti-unmanned aerial vehicle system, namely a method for quickly searching, positioning and detecting a motion state of an unmanned aerial vehicle target, wherein the working flow chart is shown in figure 1, and the specific working steps are as follows:
firstly, calibrating a camera, namely performing combined calibration by using an infrared-visible light camera to obtain an external parameter matrix between an infrared camera coordinate system and a visible light camera coordinate system, and recording the external parameter matrix as [ R t ];
secondly, the system enters a photoelectric search mode, a search range is set, the search range refers to a search azimuth angle and pitch angle range, the azimuth angle is a rotation angle amplitude in the horizontal direction, the pitch angle is a rotation angle amplitude in the vertical direction, and the range of the azimuth angle and the pitch angle is larger than a field angle of the infrared thermal imaging system;
thirdly, photoelectric searching, namely, according to the search range set in step two, searching starts in an azimuth-then-pitch raster (flyback) pattern; when the azimuth angle reaches its maximum, the pitch angle is increased, so that the photoelectric search-tracking system then searches along the path from the maximum azimuth angle back to the minimum azimuth angle;
fourthly, rapid infrared and visible light unmanned aerial vehicle target detection is carried out, if the unmanned aerial vehicle target is detected in the infrared image and the visible light image simultaneously, the unmanned aerial vehicle target is determined to be a true target, and the step five is carried out; otherwise, determining that the target is not searched, and switching the holder from one search angle to the next search angle for searching again, namely returning to the step three;
fifthly, after the infrared + visible light images detect the unmanned aerial vehicle target, rotating the holder to enable the unmanned aerial vehicle target to be located at the center of the infrared image, and recording the infrared image and the visible light image at the moment;
positioning the unmanned aerial vehicle, and calculating the accurate spatial position of the target of the unmanned aerial vehicle according to an external parameter matrix, camera internal parameters, image resolution and field angle information among the infrared-visible cameras;
analyzing the motion state, namely sampling at a fixed time interval (such as 100 ms), repeating steps four to six to obtain the spatial position at each sampling instant, and then predicting the motion direction and speed of the unmanned aerial vehicle.
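The timed-sampling motion analysis can be sketched as a finite-difference estimate over the sampled positions; the axis convention and the heading definition below are illustrative assumptions:

```python
import math

def motion_state(positions, dt):
    """Estimate speed and heading from positions sampled every dt seconds.
    positions: list of (x, y, z) samples; heading: azimuth in the XY plane."""
    (x0, y0, z0), (x1, y1, z1) = positions[0], positions[-1]
    t = dt * (len(positions) - 1)                 # total elapsed time
    vx, vy, vz = (x1 - x0) / t, (y1 - y0) / t, (z1 - z0) / t
    speed = math.sqrt(vx * vx + vy * vy + vz * vz)
    heading = math.degrees(math.atan2(vy, vx))    # direction of motion, degrees
    return speed, heading
```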
Specifically, the invention further provides a method for rapidly detecting the target of the unmanned aerial vehicle, the flow of which is shown in fig. 2, and the method comprises the following specific steps:
Step 1, label a large number of infrared images that contain and that do not contain unmanned aerial vehicles, and train a deep-learning-based infrared fully convolutional network (FCN) model; likewise label a large number of visible light images that contain and that do not contain unmanned aerial vehicles, and train a visible light fully convolutional network (FCN) model;
step2, acquiring an infrared image and a visible light image of the current search angle position;
step3, detecting the obtained infrared image by using the trained infrared full convolution network FCN model, if the unmanned aerial vehicle target is detected, determining the target as a suspected target, otherwise, not detecting the suspected target;
Step 4, rotate the pan-tilt head to translate the suspected target to the center of the infrared image; according to the mapping relationship between the infrared image and the visible light image, run the visible light fully convolutional network (FCN) model on the corresponding region of the visible light image; if the unmanned aerial vehicle target is detected there, the suspected target is determined to be a reasonable target, otherwise it is a false target;
Step 5, if the reasonable target can be detected in n consecutive frames it is a true target, otherwise it is a false target, where n is the number of image frames shot by the photoelectric search-tracking system at each stop, with a value range of 3 to 5.
Step 6, if it is a true target, enter the timed sampling mode and further analyze the motion state of the target; if no target is detected, the pan-tilt head rotates to the next search angle and the next detection process begins.
The embodiment also provides an unmanned aerial vehicle positioning method, which comprises the following steps and principles:
(1) and calibrating the infrared-visible light camera system to obtain the internal parameters of the infrared camera, the internal parameters of the visible light camera and the external parameters of the infrared-visible light camera. The extrinsic parameter matrix is noted as:
[R t] = | r11  r12  r13  t1 |
        | r21  r22  r23  t2 |    (formula 1)
        | r31  r32  r33  t3 |
(2) assuming that an infrared camera coordinate system with an infrared camera optical center as an origin O and an optical axis as a Z axis is O-XYZ, a visible light camera coordinate system with a visible light camera optical center as an origin O 'and an optical axis as a Z' axis is O '-X' Y 'Z', and an infrared camera coordinate system is a reference coordinate system, a linear equation of the infrared camera optical axis is
L_ir: x = 0, y = 0    (formula 2)
According to the camera calibration, the matrix converting the infrared camera coordinate system to the visible light camera coordinate system is [R t], and the matrix converting the infrared camera optical axis to the visible light camera optical axis is likewise [R t]. Let a point (0, 0, z0) on the optical axis of the infrared camera become (x1, y1, z1) after the transformation; then
(x1, y1, z1)ᵀ = R · (0, 0, z0)ᵀ + t
Namely, it is
x1 = r13·z0 + t1,  y1 = r23·z0 + t2,  z1 = r33·z0 + t3
Eliminating z0 gives
(x1 − t1)/r13 = (y1 − t2)/r23 = (z1 − t3)/r33
Therefore, the linear equation (point-direction form) of the optical axis of the visible light camera is
L_vl: (x − t1)/r13 = (y − t2)/r23 = (z − t3)/r33    (formula 3)
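The derivation above can be checked numerically: points (0, 0, z0) on the infrared optical axis map to R·(0, 0, z0)ᵀ + t = z0·(r13, r23, r33) + t, so the transformed axis passes through t with direction given by the third column of R. A small sketch (R and t values in the test are hypothetical):

```python
def visible_axis(R, t):
    """Point and direction of the visible camera optical axis per formula 3:
    the axis passes through t with direction (r13, r23, r33)."""
    direction = (R[0][2], R[1][2], R[2][2])  # third column of R
    point = tuple(t)                          # image of the point z0 = 0
    return point, direction
```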
(3) When the unmanned aerial vehicle is found, the pan-tilt head is rotated so that it sits at the exact center P of the infrared image; the positional relationship among the unmanned aerial vehicle target, the infrared camera and the visible light camera is shown in figure 3. Ignoring equipment and calculation errors, the coordinates of the unmanned aerial vehicle could in theory be obtained simply as the intersection of the infrared camera optical axis line and the line O'P, realizing the positioning. In practice, however, camera calibration, pan-tilt rotation and the other operations and calculations all introduce errors, so the two spatial lines do not necessarily intersect. The point with the minimum sum of distances to the infrared camera optical axis line OP and to the line O'P is therefore taken as the spatial position of the unmanned aerial vehicle target.
The specific calculation process is as follows:
when the pan-tilt head is rotated so that the unmanned aerial vehicle sits at the center of the infrared image, the corresponding infrared and visible light images are recorded, as shown in fig. 4. From the visible light image, the unmanned aerial vehicle target deviates from the image center by n_x pixels in the column direction and n_y pixels in the row direction, so the corresponding actual distances are Δx and Δy respectively. Assuming the width of the visible light camera CCD sensor is w and its resolution is h pixels, then
Δx = n_x · w / h,  Δy = n_y · w / h
The unmanned aerial vehicle target is at the center P of the infrared image, so the infrared camera optical axis line L_ir coincides with OP; in the visible light image, according to the camera imaging principle (the two-dimensional imaging schematic in the X direction is shown in fig. 5):
tan α = Δx / f
tan β = Δy / f    (formula 4)
where f is the visible light camera focal length.
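The pixel-offset and angle computation is a direct calculation. In this sketch, f denotes the visible camera focal length in the same metric units as the sensor width w, and the metric pixel size is assumed to be w/h so that Δx = n_x·w/h (the numeric values in the test are illustrative):

```python
import math

def offset_angles(nx, ny, w, h, f):
    """Convert a pixel offset (nx columns, ny rows) from the image center into
    the rotation angles alpha (about Y) and beta (about X) of formula 4."""
    pixel = w / h                      # metric size of one pixel: width / resolution
    dx, dy = nx * pixel, ny * pixel    # actual offsets on the sensor
    alpha = math.atan2(dx, f)          # tan(alpha) = dx / f
    beta = math.atan2(dy, f)           # tan(beta)  = dy / f
    return alpha, beta
```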
In three-dimensional space, the visible light camera imaging relationship is shown in fig. 6: rotating O'P about the visible light camera optical center O', first by the angle β about the X axis and then by the angle α about the Y axis, yields the visible light optical axis L_vl.
Assume a point on the line O'P has coordinates (x, y, z) in the infrared camera reference coordinate system. Rotating it onto the Z axis of the visible light camera reference frame gives (x1, y1, z1), and applying the transformation matrix [R t] then yields a point (0, 0, z0) on the Z axis of the infrared camera coordinate system. Then:
Rotating by an angle β around the X axis gives

(x', y', z')^T = [[1, 0, 0], [0, cos β, −sin β], [0, sin β, cos β]] · (x, y, z)^T
Rotating the result by an angle α around the Y axis gives

(x_1, y_1, z_1)^T = [[cos α, 0, sin α], [0, 1, 0], [−sin α, 0, cos α]] · (x', y', z')^T
Namely:

[R t] · (x_1, y_1, z_1, 1)^T = (0, 0, z_0)^T
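The two elementary rotations can be sketched in Python (an illustrative helper with one common sign convention, not the patent's own code; note that for the ray through the image point to land exactly on the optical axis, the second angle must be computed after the first rotation, hence the √(Δy² + f²) term below):

```python
import math

def rot_x(beta, v):
    """Rotate vector v = (x, y, z) by angle beta about the X axis."""
    x, y, z = v
    c, s = math.cos(beta), math.sin(beta)
    return (x, c * y - s * z, s * y + c * z)

def rot_y(alpha, v):
    """Rotate vector v = (x, y, z) by angle alpha about the Y axis."""
    x, y, z = v
    c, s = math.cos(alpha), math.sin(alpha)
    return (c * x + s * z, y, -s * x + c * z)

# Direction of the ray O'P through the image point (dx, dy) at focal length f
dx, dy, f = 0.012, -0.005, 0.05
beta = math.atan2(dy, f)                     # first rotation removes the y component
alpha = -math.atan2(dx, math.hypot(dy, f))   # second rotation removes the x component
aligned = rot_y(alpha, rot_x(beta, (dx, dy, f)))
# 'aligned' now lies on the Z (optical) axis: (0, 0, |(dx, dy, f)|)
```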
Meanwhile, according to Equation 4, there are:

sin α = Δx / √(Δx² + f²), cos α = f / √(Δx² + f²), sin β = Δy / √(Δy² + f²), cos β = f / √(Δy² + f²)
Namely, writing (a_ij) = R · R_Y(α) · R_X(β) for the combined rotation and (t_1, t_2, t_3)^T = t, the first two components give:

a_11·x + a_12·y + a_13·z + t_1 = 0 (Equation 5)

a_21·x + a_22·y + a_23·z + t_2 = 0 (Equation 6)
The general equation of the line O'P is therefore the system

a_11·x + a_12·y + a_13·z + t_1 = 0,  a_21·x + a_22·y + a_23·z + t_2 = 0

where (a_ij) = R · R_Y(α) · R_X(β) and (t_1, t_2, t_3)^T = t.
Substituting into Equations 5 and 6 and converting them into point-direction form:

(x − x_0)/M = (y − y_0)/N = (z − z_0)/L

where (M, N, L) is the direction vector of the line O'P and (x_0, y_0, z_0) is a point on it.
Theoretically, the intersection of the infrared camera optical axis line OP and the line O'P is the position of the current unmanned aerial vehicle target. However, because camera calibration, pan-tilt rotation and the calculation steps above all introduce errors, the point that minimizes the sum of its distances to the line OP and the line O'P is taken as the position of the unmanned aerial vehicle target:
Assuming the coordinates of the unmanned aerial vehicle target are (x, y, z), its distance to the infrared camera optical axis line OP (the Z axis) is

d_1 = √(x² + y²)
The distance to the straight line O' P is obtained as follows:
Let the foot of the perpendicular from the unmanned aerial vehicle target (x, y, z) to the line O'P be (x_c, y_c, z_c). Since this foot lies on the line,

(x_c − x_0)/M = (y_c − y_0)/N = (z_c − z_0)/L = C (Equation 8)
which can be rearranged as

x_c = M·C + x_0,  y_c = N·C + y_0,  z_c = L·C + z_0
Moreover, the dot product of the perpendicular direction vector (x − x_c, y − y_c, z − z_c) and the line direction vector (M, N, L) equals 0, i.e.

M·(x − x_c) + N·(y − y_c) + L·(z − z_c) = 0 (Equation 9)
Substituting Equation 8 into Equation 9 and eliminating the unknowns x_c, y_c, z_c gives an expression for the parameter C:

C = [M·(x − x_0) + N·(y − y_0) + L·(z − z_0)] / (M² + N² + L²) (Equation 10)
The distance from the point to the line O'P is its distance to the foot of the perpendicular (x_c, y_c, z_c):

d_2 = √((x − x_c)² + (y − y_c)² + (z − z_c)²)
where x_c, y_c, z_c are eliminated by substituting Equations 8 and 10. The point minimizing the sum of the distances to the infrared camera optical axis line OP and to the line O'P is therefore found by minimizing

d = d_1 + d_2

and the value of (x, y, z) at the minimum is the spatial position of the unmanned aerial vehicle target.
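Since the sum of distances to two skew lines reaches its minimum (the inter-line distance) at every point of their common perpendicular segment, one concrete way to realize the criterion above, sketched here as illustrative Python rather than the patent's own procedure, is to compute the closest points of the two lines and take their midpoint. Line OP is the Z axis; line O'P is given in point-direction form by a point b = (x_0, y_0, z_0) and direction v = (M, N, L):

```python
def closest_point_between_lines(b, v):
    """Midpoint of the common perpendicular between the infrared optical
    axis OP (point (0,0,0), direction (0,0,1)) and the line O'P through
    point b with direction v; any point on that segment minimises the sum
    of the distances to the two lines."""
    a, u = (0.0, 0.0, 0.0), (0.0, 0.0, 1.0)
    dot = lambda p, q: sum(pi * qi for pi, qi in zip(p, q))
    w0 = tuple(ai - bi for ai, bi in zip(a, b))
    A, B, C = dot(u, u), dot(u, v), dot(v, v)
    D, E = dot(u, w0), dot(v, w0)
    denom = A * C - B * B          # zero only if the two lines are parallel
    s = (B * E - C * D) / denom    # parameter of the closest point on OP
    t = (A * E - B * D) / denom    # parameter of the closest point on O'P
    p1 = tuple(ai + s * ui for ai, ui in zip(a, u))
    p2 = tuple(bi + t * vi for bi, vi in zip(b, v))
    return tuple((c1 + c2) / 2.0 for c1, c2 in zip(p1, p2))

# Example: O'P passes through (1, 0, 0) along the Y axis; the closest points
# are (0, 0, 0) and (1, 0, 0), so the estimate is their midpoint (0.5, 0, 0).
target = closest_point_between_lines((1.0, 0.0, 0.0), (0.0, 1.0, 0.0))
```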
(4) Calculating the relative position of the target from the coordinates of the unmanned aerial vehicle target in the infrared camera reference coordinate system, converting them into the GPS coordinates and elevation of the target by combining the GPS coordinates and elevation of the infrared camera, and transmitting these to a Geographic Information System (GIS);
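Step (4)'s conversion can be sketched with a flat-earth approximation, assuming the target offset has already been expressed in local east/north/up metres at the camera (the function name, the constant and this simplification are assumptions, adequate only for the short ranges of an optical search system):

```python
import math

EARTH_RADIUS = 6378137.0  # WGS-84 equatorial radius, metres

def enu_offset_to_gps(cam_lat, cam_lon, cam_alt, east, north, up):
    """Convert a small east/north/up offset (metres) from the camera into
    the target's latitude, longitude (degrees) and elevation (metres)."""
    dlat = math.degrees(north / EARTH_RADIUS)
    dlon = math.degrees(east / (EARTH_RADIUS * math.cos(math.radians(cam_lat))))
    return cam_lat + dlat, cam_lon + dlon, cam_alt + up
```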
(5) Acquiring images at fixed time intervals and repeating steps (2) to (4) to obtain the position of the unmanned aerial vehicle target at each sampling time point, thereby estimating the direction of motion and speed of the target.
Taking the infrared camera reference coordinate system of step (3) as an example, let the position of the unmanned aerial vehicle target at sampling time t_1 be P_1 = (x_1, y_1, z_1), and let the previous sampling point, at time t_0, be P_0 = (x_0, y_0, z_0). The estimated direction of motion is then

(P_1 − P_0) / |P_1 − P_0|

and the estimated speed is

|P_1 − P_0| / (t_1 − t_0)

both of which are further converted into motion data under the GIS system.
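The direction and speed estimates amount to a finite difference between two consecutive samples; an illustrative helper (the names are assumptions, not the patent's code):

```python
import math

def motion_estimate(p_prev, t_prev, p_curr, t_curr):
    """Estimate direction (unit vector) and speed of the target from two
    sampled positions in the same reference coordinate system."""
    d = tuple(c - p for c, p in zip(p_curr, p_prev))
    dist = math.sqrt(sum(di * di for di in d))
    direction = tuple(di / dist for di in d) if dist > 0 else (0.0, 0.0, 0.0)
    return direction, dist / (t_curr - t_prev)

# Example: 3-4-5 displacement over one second -> speed 5 m/s
direction, speed = motion_estimate((0.0, 0.0, 0.0), 0.0, (3.0, 4.0, 0.0), 1.0)
```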
The unmanned aerial vehicle target positioning method based on double-spectrum photoelectric search provided by the invention has been described in detail above. The description of the specific embodiments is intended only to facilitate understanding of the method of the invention and its core ideas. It should be noted that those skilled in the art can make various improvements and modifications to the invention without departing from its principle, and such improvements and modifications also fall within the protection scope of the claims of the invention.

Claims (8)

1. An unmanned aerial vehicle target positioning method based on double-spectrum photoelectric search is characterized by comprising the following steps:
firstly, calibrating the cameras: performing joint calibration with the infrared and visible light cameras to obtain the external parameter matrix between the infrared camera coordinate system and the visible light camera coordinate system, recorded as [R t];
setting a photoelectric searching range, wherein the searching range comprises an azimuth angle and a pitch angle, the azimuth angle is a rotation angle amplitude in the horizontal direction, the pitch angle is a rotation angle amplitude in the vertical direction, and the interval between the azimuth angle and the pitch angle is larger than the field angle of the infrared thermal imaging system;
thirdly, photoelectric searching, namely, according to the searching range set in the second step, searching is started in a flyback mode according to the azimuth angle and the pitch angle;
fourthly, performing rapid infrared + visible light unmanned aerial vehicle target detection: if the unmanned aerial vehicle target is detected in both the infrared image and the visible light image, it is determined to be a true target and step five is entered; otherwise, it is determined that no target has been found, and the pan-tilt switches from the current search angle to the next search angle to search again, i.e. the process returns to step three;
fifthly, after the infrared + visible light images detect the unmanned aerial vehicle target, rotating the holder to enable the unmanned aerial vehicle target to be located at the center of the infrared image, and recording the infrared image and the visible light image at the moment;
positioning the unmanned aerial vehicle: calculating the accurate spatial position of the unmanned aerial vehicle target according to the external parameter matrix between the infrared and visible light cameras, the camera internal parameters, the image resolution and the field angle information;
analyzing the motion state: sampling at a fixed time interval and repeating operation steps four to six to obtain the spatial position information of each sampling point, thereby estimating the direction of motion and speed of the unmanned aerial vehicle.
2. The method of claim 1, wherein in step three, when the photoelectric search is performed, the pitch angle is raised when the azimuth angle reaches its maximum, so that the photoelectric search tracking system then searches from the maximum azimuth angle back to the minimum azimuth angle.
3. The method for unmanned aerial vehicle target positioning based on double-spectrum photoelectric search as claimed in claim 1, wherein the specific method for fast infrared + visible light unmanned aerial vehicle target detection in step four comprises the following steps:
step1, annotating a large number of infrared images that do and do not contain unmanned aerial vehicles, and training a deep-learning-based infrared fully convolutional network (FCN) model; annotating a large number of visible light images that do and do not contain unmanned aerial vehicles, and training a deep-learning-based visible light fully convolutional network (FCN) model;
step2, acquiring an infrared image and a visible light image of the current search angle position;
step3, detecting the obtained infrared image by using the trained infrared full convolution network FCN model, if the unmanned aerial vehicle target is detected, determining the target as a suspected target, otherwise, not detecting the suspected target;
step4, rotating the pan-tilt to translate the suspected target to the center of the infrared image, and performing visible light fully convolutional network FCN model detection in the corresponding region of the visible light image according to the mapping relation between the infrared image and the visible light image; if the unmanned aerial vehicle target is detected, the suspected target is determined to be a plausible target, otherwise it is a false target;
step5, if the plausible target can be detected in n consecutive frames, the target is a true target, otherwise it is a false target, where n denotes the number of image frames captured by the photoelectric search tracking system at each dwell, and n is greater than 1;
step6, if the target is a true target, entering a timed sampling mode and further analyzing the motion state of the target; if no target is detected, the pan-tilt rotates to the next search angle to carry out the next detection process.
4. The unmanned aerial vehicle target positioning method based on double-spectrum photoelectric search as claimed in claim 3, wherein the value range of n is 3-5.
5. The unmanned aerial vehicle target positioning method based on double-spectrum photoelectric search as claimed in claim 1, wherein the infrared-visible light camera system is calibrated to obtain the internal parameters of the infrared camera and the internal parameters of the visible light camera.
6. The unmanned aerial vehicle target positioning method based on double-spectrum photoelectric search of claim 5, wherein the external parameter matrix between the infrared camera coordinate system and the visible light camera coordinate system in step one is:

[R t] = [[r_11, r_12, r_13, t_1], [r_21, r_22, r_23, t_2], [r_31, r_32, r_33, t_3]] (Equation 1)

where R is the 3×3 rotation matrix and t is the translation vector from the infrared camera coordinate system to the visible light camera coordinate system.
7. The method for locating the target of the unmanned aerial vehicle based on the double-spectrum photoelectric search as claimed in claim 6, wherein the specific method for positioning the unmanned aerial vehicle in step six comprises the following steps:
(1) let the infrared camera coordinate system, with the infrared camera optical center as origin O and the optical axis as the Z axis, be O-XYZ, and the visible light camera coordinate system, with the visible light camera optical center as origin O' and the optical axis as the Z' axis, be O'-X'Y'Z'; taking the infrared camera coordinate system as the reference coordinate system, the equation of the infrared camera optical axis line is

L_ir: x = 0, y = 0 (Equation 2)
According to the camera calibration, the transformation matrix from the infrared camera coordinate system to the visible light camera coordinate system is [R t], so the matrix transforming the infrared camera optical axis into the visible light camera optical axis is also [R t], and the equation of the visible light camera optical axis line is

L_vl: R · (x, y, z)^T + t = (0, 0, z')^T for some z' (Equation 3)
(2) when the unmanned aerial vehicle is found, rotating the pan-tilt so that it sits at the exact center P of the infrared image; the point with the minimum sum of distances to the infrared camera optical axis line OP and to the line O'P is then taken as the spatial position of the unmanned aerial vehicle target;
(3) calculating the relative position of the target from the coordinates of the unmanned aerial vehicle target in the infrared camera reference coordinate system, converting them into the GPS coordinates and elevation of the target by combining the GPS coordinates and elevation of the infrared camera, and transmitting these to a Geographic Information System (GIS);
(4) acquiring images at fixed time intervals and repeating steps (1) to (3) to obtain the position of the unmanned aerial vehicle target at each sampling time point, thereby estimating the direction of motion and speed of the target.
8. The method for locating the target of the unmanned aerial vehicle based on the double-spectrum photoelectric search according to claim 7,
in step (2), the method for obtaining the point with the minimum sum of distances to the infrared camera optical axis line OP and to the line O'P is specifically as follows:
when the pan-tilt has been rotated so that the unmanned aerial vehicle sits at the center of the infrared image, the corresponding infrared and visible light images are recorded; from the visible light image, the unmanned aerial vehicle target is offset from the image center by n_x pixels in the column direction and n_y pixels in the row direction, and the corresponding actual distances on the sensor are Δx and Δy respectively; assuming the width of the visible light camera CCD sensor is w and its resolution in that direction is h pixels, then
Δx = n_x · (w/h),  Δy = n_y · (w/h)
the unmanned aerial vehicle target is at the center P of the infrared image, so the infrared camera optical axis line L_ir coincides with OP; in the visible light image, according to the camera imaging principle, with f the focal length of the visible light camera:

tan α = Δx / f,  tan β = Δy / f (Equation 4)
wherein α and β are the angles through which O'P is rotated around the Y axis and the X axis respectively, with O' as the center; according to the visible light camera imaging relationship in three-dimensional space, rotating through the angles α and β yields the visible light optical axis L_vl;
Assume that a point on the line O'P has coordinates (x, y, z) in the infrared camera reference coordinate system. Rotating this point onto the optical axis (Z' axis) of the visible light camera reference coordinate system yields (x_1, y_1, z_1); applying the transformation matrix [R t] then yields a point (0, 0, z_0) on the Z axis of the infrared camera coordinate system. Then:
Rotating by an angle β around the X axis gives

(x', y', z')^T = [[1, 0, 0], [0, cos β, −sin β], [0, sin β, cos β]] · (x, y, z)^T
Rotating the result by an angle α around the Y axis gives

(x_1, y_1, z_1)^T = [[cos α, 0, sin α], [0, 1, 0], [−sin α, 0, cos α]] · (x', y', z')^T
Namely:

[R t] · (x_1, y_1, z_1, 1)^T = (0, 0, z_0)^T
Meanwhile, according to Equation 4, there are:

sin α = Δx / √(Δx² + f²), cos α = f / √(Δx² + f²), sin β = Δy / √(Δy² + f²), cos β = f / √(Δy² + f²)
Namely, writing (a_ij) = R · R_Y(α) · R_X(β) for the combined rotation and (t_1, t_2, t_3)^T = t, the first two components give:

a_11·x + a_12·y + a_13·z + t_1 = 0 (Equation 5)

a_21·x + a_22·y + a_23·z + t_2 = 0 (Equation 6)
The general equation of the line O'P is therefore the system

a_11·x + a_12·y + a_13·z + t_1 = 0,  a_21·x + a_22·y + a_23·z + t_2 = 0

where (a_ij) = R · R_Y(α) · R_X(β) and (t_1, t_2, t_3)^T = t.
Substituting into Equations 5 and 6 and converting them into point-direction form:

(x − x_0)/M = (y − y_0)/N = (z − z_0)/L

where (M, N, L) is the direction vector of the line O'P and (x_0, y_0, z_0) is a point on it.
The point with the minimum sum of distances to the infrared camera optical axis line OP and to the line O'P is then found as the position of the unmanned aerial vehicle target:
Assuming the coordinates of the unmanned aerial vehicle target are (x, y, z), its distance to the infrared camera optical axis line OP (the Z axis) is

d_1 = √(x² + y²)
The distance to the straight line O' P is obtained as follows:
Let the foot of the perpendicular from the unmanned aerial vehicle target (x, y, z) to the line O'P be (x_c, y_c, z_c). Since this foot lies on the line,

(x_c − x_0)/M = (y_c − y_0)/N = (z_c − z_0)/L = C (Equation 8)
which can be rearranged as

x_c = M·C + x_0,  y_c = N·C + y_0,  z_c = L·C + z_0
Moreover, the dot product of the perpendicular direction vector (x − x_c, y − y_c, z − z_c) and the line direction vector (M, N, L) equals 0, i.e.

M·(x − x_c) + N·(y − y_c) + L·(z − z_c) = 0 (Equation 9)
Substituting Equation 8 into Equation 9 and eliminating the unknowns x_c, y_c, z_c gives an expression for the parameter C:

C = [M·(x − x_0) + N·(y − y_0) + L·(z − z_0)] / (M² + N² + L²) (Equation 10)
The distance from the point to the line O'P is its distance to the foot of the perpendicular (x_c, y_c, z_c):

d_2 = √((x − x_c)² + (y − y_c)² + (z − z_c)²)
where x_c, y_c, z_c are eliminated by substituting Equations 8 and 10. The point minimizing the sum of the distances to the infrared camera optical axis line OP and to the line O'P is therefore found by minimizing

d = d_1 + d_2

and the value of (x, y, z) at the minimum is the spatial position of the unmanned aerial vehicle target.
CN201910179876.XA 2019-03-11 2019-03-11 Unmanned aerial vehicle target positioning method based on double-spectrum photoelectric search Active CN110081982B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910179876.XA CN110081982B (en) 2019-03-11 2019-03-11 Unmanned aerial vehicle target positioning method based on double-spectrum photoelectric search


Publications (2)

Publication Number Publication Date
CN110081982A CN110081982A (en) 2019-08-02
CN110081982B true CN110081982B (en) 2021-01-15

Family

ID=67412392


Country Status (1)

Country Link
CN (1) CN110081982B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111178148B (en) * 2019-12-06 2023-06-02 天津大学 Ground target geographic coordinate positioning method based on unmanned aerial vehicle vision system
CN111627048B (en) * 2020-05-19 2022-07-01 浙江大学 Multi-camera cooperative target searching method
CN111765974B (en) * 2020-07-07 2021-04-13 中国环境科学研究院 Wild animal observation system and method based on miniature refrigeration thermal infrared imager
CN111924101B (en) * 2020-08-31 2024-04-09 金陵科技学院 Unmanned aerial vehicle double-cradle head camera and working method thereof
CN112651347B (en) * 2020-12-29 2022-07-05 嘉兴恒创电力集团有限公司博创物资分公司 Smoking behavior sample generation method and system based on double-spectrum imaging
CN114489148B (en) * 2021-12-30 2023-08-29 中国航天系统科学与工程研究院 Anti-unmanned aerial vehicle system based on intelligent detection and electronic countermeasure
CN115597498B (en) * 2022-12-13 2023-03-31 成都铂贝科技有限公司 Unmanned aerial vehicle positioning and speed estimation method

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105004354A (en) * 2015-06-19 2015-10-28 北京航空航天大学 Unmanned aerial vehicle visible light and infrared image target positioning method under large squint angle
CN107167766A (en) * 2017-05-19 2017-09-15 江苏速度电子科技有限公司 UAV system ULTRA-WIDEBAND RADAR fire-fighting alignment system
CN207147634U (en) * 2017-08-16 2018-03-27 上海融军科技有限公司 Unmanned plane positions and lever position indicator

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9285296B2 (en) * 2013-01-02 2016-03-15 The Boeing Company Systems and methods for stand-off inspection of aircraft structures
CN211825673U (en) * 2015-12-07 2020-10-30 菲力尔系统公司 Image forming apparatus with a plurality of image forming units
US20170166322A1 (en) * 2015-12-15 2017-06-15 Dexter Research Center, Inc. Infrared temperature monitoring system for aircraft
CN108052110A (en) * 2017-09-25 2018-05-18 南京航空航天大学 UAV Formation Flight method and system based on binocular vision


Also Published As

Publication number Publication date
CN110081982A (en) 2019-08-02

Similar Documents

Publication Publication Date Title
CN110081982B (en) Unmanned aerial vehicle target positioning method based on double-spectrum photoelectric search
CN110414396B (en) Unmanned ship perception fusion algorithm based on deep learning
CN103065323B (en) Subsection space aligning method based on homography transformational matrix
CN110850403A (en) Multi-sensor decision-level fused intelligent ship water surface target feeling knowledge identification method
CN109737981B (en) Unmanned vehicle target searching device and method based on multiple sensors
CN106405540A (en) Radar and photoelectric device complementation-based detection and identification device and method
CN108596117B (en) Scene monitoring method based on two-dimensional laser range finder array
CN110082783B (en) Cliff detection method and device
CN110568433A (en) High-altitude parabolic detection method based on millimeter wave radar
CN109375211B (en) Radar and multi-optical equipment-based mobile unmanned platform target searching method
CN111753694B (en) Unmanned vehicle target searching system and method
CN104297758A (en) Assistant berthing device and assistant berthing method based on 2D pulse type laser radar
CN112596048B (en) Method for accurately detecting position of low-speed unmanned aerial vehicle through radar photoelectric cooperation
CN113780246B (en) Unmanned aerial vehicle three-dimensional track monitoring method and system and three-dimensional monitoring device
CN104166137A (en) Target comprehensive parameter tracking measurement method based on display of radar warning situation map
CN115032627A (en) Distributed multi-sensor multi-mode unmanned cluster target fusion tracking method
CN115291219A (en) Method and device for realizing dynamic obstacle avoidance of unmanned aerial vehicle by using monocular camera and unmanned aerial vehicle
CN113819881A (en) Fire source distance and map azimuth detection method for reconnaissance and inspection robot
CN117173215A (en) Inland navigation ship whole-course track identification method and system crossing cameras
CN111665509A (en) Intelligent collision-prevention radar
CN116520275A (en) Radar photoelectric integrated method and system for detecting and tracking low-speed small target
CN111402324B (en) Target measurement method, electronic equipment and computer storage medium
CN114973037B (en) Method for intelligently detecting and synchronously positioning multiple targets by unmanned aerial vehicle
CN115035470A (en) Low, small and slow target identification and positioning method and system based on mixed vision
CN116543141A (en) Unmanned aerial vehicle identification and positioning method based on acoustic signal and image fusion

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant