CN105424006A - Unmanned aerial vehicle hovering precision measurement method based on binocular vision - Google Patents

Unmanned aerial vehicle hovering precision measurement method based on binocular vision

Info

Publication number
CN105424006A
Authority
CN
China
Prior art keywords
unmanned aerial vehicle
camera
target
eye image
Prior art date
Legal status
Granted
Application number
CN201510736167.9A
Other languages
Chinese (zh)
Other versions
CN105424006B (en)
Inventor
王万国
刘俍
刘越
张方正
董罡
雍军
吴观斌
慕世友
傅孟潮
魏传虎
张飞
李建祥
赵金龙
Current Assignee
State Grid Intelligent Technology Co Ltd
Original Assignee
State Grid Corp of China SGCC
Electric Power Research Institute of State Grid Shandong Electric Power Co Ltd
Shandong Luneng Intelligence Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by State Grid Corp of China SGCC, Electric Power Research Institute of State Grid Shandong Electric Power Co Ltd, Shandong Luneng Intelligence Technology Co Ltd filed Critical State Grid Corp of China SGCC
Priority to CN201510736167.9A priority Critical patent/CN105424006B/en
Publication of CN105424006A publication Critical patent/CN105424006A/en
Application granted granted Critical
Publication of CN105424006B publication Critical patent/CN105424006B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04 Interpretation of pictures
    • G01C11/06 Interpretation of pictures by comparison of two or more pictures of the same area
    • G01C11/08 Interpretation of pictures by comparison of two or more pictures of the same area, the pictures not being supported in the same relative position as when they were taken

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses an unmanned aerial vehicle hovering precision measurement method based on binocular vision. The method comprises the following steps: in a calibration stage, the cameras are calibrated with Zhang Zhengyou's chessboard calibration method to determine the calibration parameters and define the calibration result parameters; in a positioning stage, when unmanned aerial vehicle hovering precision is measured, a slide rail is placed under the unmanned aerial vehicle hovering point, the binocular cameras are fixed on the slide rail in parallel at a set distance and can move along the slide rail, the camera lenses are placed vertically upwards, the two camera imaging planes lie in the same plane, and their optical axes are parallel to each other; the left eye camera and the right eye camera collect unmanned aerial vehicle images and transmit them to a computer; the computer calculates the three-dimensional position coordinates of the unmanned aerial vehicle from the left eye images, the right eye images and the calibration result parameters; after hovering is finished, the hovering precision is calculated from the three-dimensional track of the unmanned aerial vehicle. The method achieves target detection, tracking, precise matching and three-dimensional positioning of the unmanned aerial vehicle.

Description

Unmanned aerial vehicle hovering precision measurement method based on binocular vision
Technical Field
The invention relates to an unmanned aerial vehicle hovering precision measuring method based on binocular vision.
Background
The hovering precision of an unmanned aerial vehicle is an important index of its performance, reflecting the stability and precision of the flight control system, the core of the unmanned aerial vehicle. At present, in detecting and testing the flight functions of unmanned aerial vehicles, hovering precision is measured by manual observation, whose safety, objectivity and normalization cannot be guaranteed.
The three-dimensional space positioning technology of the unmanned aerial vehicle mainly has two modes: the positioning mode based on airborne equipment and the positioning mode based on ground equipment.
According to the type of airborne equipment, there are mainly 3 positioning technologies based on airborne equipment: fig. 4(a) GPS device based, fig. 4(b) onboard video based, and fig. 4(c) inertial navigation device based positioning techniques. The airborne equipment is integrated in the flight control system of each unmanned aerial vehicle rather than being independent of it, so flexibility is poor and the approach is not suited to the task of positioning different unmanned aerial vehicles.
The above problems can be avoided by a ground based positioning approach. According to different types of ground equipment, the ground equipment-based positioning technology can be divided into 3 types: fig. 4(d) ultrasonic rangefinder, fig. 4(e) laser rangefinder, and fig. 4(f) machine vision based positioning techniques. The ultrasonic or laser range finder is mainly used for measuring the distance of a target, and the three-dimensional laser range finder is low in scanning speed, mainly applied to reconstruction of a static three-dimensional scene and incapable of calculating the motion track of the target.
Disclosure of Invention
The invention aims to solve the problems and provides a binocular vision-based unmanned aerial vehicle hovering precision measuring method, which can calculate the three-dimensional flight track of an unmanned aerial vehicle in real time, automatically calculate hovering precision and improve the accuracy and the normalization of measurement.
In order to achieve the purpose, the invention adopts the following technical scheme:
the unmanned aerial vehicle hovering precision measurement method based on binocular vision comprises the following steps:
step (1): a calibration stage: calibrating the cameras with Zhang Zhengyou's chessboard calibration method, thereby determining the calibration parameters and defining the calibration result parameters;
step (2): a positioning stage: when the hovering precision of the unmanned aerial vehicle is measured, a slide rail is placed under the hovering point of the unmanned aerial vehicle; the binocular cameras are fixed on the slide rail in parallel at a set distance and can move along the slide rail; the camera lenses are placed vertically upwards, the imaging planes of the binocular cameras lie in the same plane, and the optical axes are parallel to each other; the left eye camera and the right eye camera respectively collect images of the unmanned aerial vehicle and transmit them to the computer; the computer calculates the three-dimensional position coordinates of the unmanned aerial vehicle from the collected left eye and right eye images combined with the calibration result parameters obtained in step (1); and after hovering, the hovering precision is calculated from the three-dimensional track of the unmanned aerial vehicle.
The step (1) comprises the following steps:
step (1-1): fixing two cameras on the same slide rail, defining a distance L, and adjusting the positions of the two cameras on the slide rail to enable the distance between the center points of the two cameras to be L;
step (1-2): calibrating the cameras with Zhang Zhengyou's chessboard calibration method, and recording the calibration result parameter result = {M_left, D_left, M_right, D_right, R, T}, where result is the calibration result parameter, M_left and D_left are the camera matrix and distortion coefficient vector of the left eye camera, M_right and D_right are the camera matrix and distortion coefficient vector of the right eye camera, and R and T are the rotation matrix and translation vector between the two cameras. For each camera,

M = [ f_x 0 c_x ; 0 f_y c_y ; 0 0 1 ]

where M is the camera matrix, f_x and f_y are the focal lengths in pixels, and (c_x, c_y) is the principal point in pixels.
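As an illustrative sketch (not part of the patent), the pinhole projection encoded by the camera matrix M can be written out directly; the focal lengths and principal point below are assumed example values, not calibration results from the described system.

```python
def make_camera_matrix(fx, fy, cx, cy):
    """Camera matrix M = [[fx, 0, cx], [0, fy, cy], [0, 0, 1]]."""
    return [[fx, 0.0, cx], [0.0, fy, cy], [0.0, 0.0, 1.0]]

def project(M, point_cam):
    """Project a 3D point (X, Y, Z) in the camera coordinate system
    to pixel coordinates (u, v) via the pinhole model:
    u = fx * X / Z + cx,  v = fy * Y / Z + cy."""
    X, Y, Z = point_cam
    u = M[0][0] * X / Z + M[0][2]
    v = M[1][1] * Y / Z + M[1][2]
    return (u, v)

# Assumed example parameters: fx = fy = 800 px, principal point (320, 240).
M = make_camera_matrix(800.0, 800.0, 320.0, 240.0)
u, v = project(M, (0.1, -0.05, 2.0))  # a point 2 m in front of the camera
```

A quick sanity check on M: any point on the optical axis, e.g. (0, 0, 1), projects exactly to the principal point (c_x, c_y).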
The step (2) comprises the following steps:
step (2-1): when the hovering precision of the unmanned aerial vehicle is measured, a sliding rail is placed right below a hovering point of the unmanned aerial vehicle, a binocular camera is fixed on the sliding rail in parallel, a camera lens is placed vertically upwards, two camera imaging planes are located on the same plane, and optical axes are parallel to each other; the left eye camera acquires a left eye image of the unmanned aerial vehicle, and the right eye camera acquires a right eye image of the unmanned aerial vehicle;
step (2-2): positioning a target area: selecting a target area in the left eye image in a manual selection mode;
step (2-3): target tracking: tracking the target in the left eye image with the TLD algorithm;
step (2-4): target matching: searching a matching area which is most similar to a target area of the left eye image in the right eye image;
step (2-5): matching points with the same name: respectively using the central points of the rectangular target areas in the left eye image and the right eye image as homonymous points;
step (2-6): and (3) calculating three-dimensional coordinates: establishing a coordinate system of the camera, and calculating the three-dimensional coordinates of the target point in the coordinate system of the camera by combining the calibration result parameters obtained in the step (1);
step (2-7): and (5) hovering precision evaluation, namely calculating the hovering precision of the unmanned aerial vehicle according to the three-dimensional coordinate track of the target point.
The steps of the step (2-2) are as follows: let the time t at which target positioning starts be 0. First, manually frame the target area: a rectangular area with upper-left corner point B_L, height h and width w.
The steps of the step (2-3) are as follows: at times t ≥ 1, track the target in the left eye image with the TLD algorithm, starting from the target area of the left eye image determined at time t = 0.
The steps of the step (2-4) are as follows: each time the target area in the left eye image is obtained, search the right eye image for the matching area most similar to that target area; the matching area is a rectangular area with upper-left corner point B_R, height h and width w.

The target match is then expressed as:

min_{x_R, y_R} Σ_{i=0}^{h} Σ_{j=0}^{w} | I_left[x_L + i][y_L + j] − I_right[x_R + i][y_R + j] |    (1)

where I_left is the gray value of the left eye image, I_right is the gray value of the right eye image, (x_L, y_L) are the coordinates of point B_L, and (x_R, y_R) are the coordinates of point B_R. At this time the search range is x_R ∈ [0, x_L], y_R ∈ [y_L − s_h/2, y_L + s_h/2], where s_h is the height of the search area. The coordinates (x_R, y_R) of point B_R that minimize formula (1) are thus obtained, and the parallax is d = x_L − x_R.

At times t ≥ 1, after the left eye image yields a new target area through the TLD algorithm, the search range in the right eye image is updated to x_R ∈ [x_L − d − s_w/2, x_L − d + s_w/2], y_R ∈ [y_L − s_h/2, y_L + s_h/2], where s_w is the width of the search area, and the target area in the right eye image is determined according to formula (1); and so on, the target area in the left eye image of each frame and the corresponding area of the same target in the right eye image are calculated.
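The sum-of-absolute-differences search of formula (1) can be sketched in plain Python. This is a brute-force illustration, not the patent's implementation: images are nested lists indexed as I[x][y] following the formula's notation, and the search windows are passed in explicitly.

```python
def sad(I_left, xL, yL, I_right, xR, yR, h, w):
    """Sum of absolute gray-value differences over an h x w block,
    following the indexing I[x + i][y + j] of formula (1)."""
    total = 0
    for i in range(h):
        for j in range(w):
            total += abs(I_left[xL + i][yL + j] - I_right[xR + i][yR + j])
    return total

def match_target(I_left, I_right, xL, yL, h, w, x_range, y_range):
    """Return the corner (xR, yR) of the right-image block minimizing
    formula (1) over the given search ranges, plus the parallax d = xL - xR."""
    best, best_cost = None, None
    for xR in x_range:
        for yR in y_range:
            cost = sad(I_left, xL, yL, I_right, xR, yR, h, w)
            if best_cost is None or cost < best_cost:
                best_cost, best = cost, (xR, yR)
    xR, yR = best
    return best, xL - xR

# Synthetic example: a distinctive 3x3 block at (5, 4) in the left image
# appears shifted to (2, 4) in the right image, i.e. parallax d = 3.
I_l = [[0] * 12 for _ in range(12)]
I_r = [[0] * 12 for _ in range(12)]
for i in range(3):
    for j in range(3):
        I_l[5 + i][4 + j] = 13 * i + 7 * j + 1
        I_r[2 + i][4 + j] = 13 * i + 7 * j + 1
corner, d = match_target(I_l, I_r, 5, 4, 3, 3, range(0, 6), range(2, 7))
```

Restricting the search ranges as described in the text (full [0, x_L] at t = 0, a small window around the previous disparity afterwards) is what keeps this brute-force cost tolerable per frame.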
The steps of the step (2-5) are as follows:
The center point P_L = (x_L + w/2, y_L + h/2) of the target area in the left eye image is used as the homonymous point of the left eye target, and the center point P_R = (x_R + w/2, y_R + h/2) of the target area in the right eye image as the homonymous point of the right eye target.
The homonymous points, i.e. the pixels corresponding to the same position of the actual target in the left eye image and the right eye image, must be ensured to correspond to the same position of the actual target across the front and rear frames and the left and right eye images.
The steps of the step (2-6) are as follows:
The camera coordinate system takes the optical center O_L of the left eye camera as origin, with the XO_LY plane parallel to the imaging plane and the optical axis direction as the Z axis. From the calibrated camera parameter result, the reprojection matrix is obtained:

Q = [ 1 0 0 −c_x^l ; 0 1 0 −c_y^l ; 0 0 0 f^l ; 0 0 −1/T_x (c_x^l − c_x^r)/T_x ]    (2)

where (c_x^l, c_y^l) are the principal point coordinates of the left camera and (c_x^r, c_y^r) those of the right camera; T_x is the X-axis component of the translation matrix between the two cameras; f^l is the focal length of the left camera.

With the left and right eye optical axes parallel to each other, the homonymous point coordinates P_L(x_L, y_L) of the left eye image and P_R(x_R, y_R) of the right eye image are known, and the parallax of the target point in the left and right views is d = x_L − x_R. Then let

[ x̂_c ŷ_c ẑ_c ŵ_c ]^T = Q [ x_L y_L d 1 ]^T = [ x_L − c_x^l , y_L − c_y^l , f^l , (−d + c_x^l − c_x^r)/T_x ]^T    (3)

to obtain the three-dimensional coordinates of the target point in the camera coordinate system:

P_c = (x_c, y_c, z_c) = (x̂_c/ŵ_c , ŷ_c/ŵ_c , ẑ_c/ŵ_c)    (4)

where x̂_c, ŷ_c, ẑ_c and ŵ_c are intermediate result variables, and x_c, y_c and z_c are the X, Y and Z-axis coordinates of the target point P_c in the camera coordinate system, respectively.
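Formulas (2)-(4) reduce to a few arithmetic lines, sketched below with assumed calibration values. One caveat on conventions: in the rectified-stereo convention used by common libraries, T_x is negative for a camera mounted to the right, which is what makes ẑ_c/ŵ_c come out positive for a point in front of the cameras.

```python
def reproject_point(xL, yL, d, cxl, cyl, cxr, fl, Tx):
    """Apply formulas (3)-(4): multiply Q by (xL, yL, d, 1)^T and
    divide by the homogeneous coordinate w_c."""
    x_hat = xL - cxl                   # row 1 of Q
    y_hat = yL - cyl                   # row 2 of Q
    z_hat = fl                         # row 3 of Q
    w_hat = (-d + cxl - cxr) / Tx      # row 4 of Q
    return (x_hat / w_hat, y_hat / w_hat, z_hat / w_hat)

# Assumed example calibration: fl = 800 px, both principal points (320, 240),
# Tx = -0.2 (baseline 0.2 m, sign per the rectified-stereo convention above).
pc = reproject_point(xL=480.0, yL=240.0, d=40.0,
                     cxl=320.0, cyl=240.0, cxr=320.0, fl=800.0, Tx=-0.2)
```

With c_x^l = c_x^r this reduces to the familiar depth formula z = f^l · |T_x| / d, a useful check against the reconstructed Q.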
The steps of the step (2-7) are as follows:
While the unmanned aerial vehicle hovers, the three-dimensional position coordinates of the unmanned aerial vehicle are calculated in real time by the steps above, yielding the flight track of the unmanned aerial vehicle.
Suppose the set of obtained track points is P = [P_1, P_2, ..., P_N], N points in total, where
P_n = [x_n, y_n, z_n]^T, n = 1, 2, ..., N.
The centroid of the set of flight trajectory points is P_m = [x_m, y_m, z_m]^T = (1/N) Σ_{n=1}^{N} P_n.
When testing the hovering precision of the unmanned aerial vehicle, the specified height of the hover above ground is recorded as H_0.
When hovering precision is measured, the binocular camera is placed directly below the hover; specifically, the left eye camera is placed at the hovering point with its optical axis perpendicular to the horizontal plane. Hovering precision is divided into the horizontal deviation E_Hovering^h and vertical deviation E_Hovering^v of fixed-point hovering precision, and the horizontal deviation E_Control^h and vertical deviation E_Control^v of hover control accuracy.
Since O_L is the origin of the coordinate system, the formulas are:

E_Hovering^h = sqrt(x_m² + y_m²),
E_Hovering^v = | z_m − H_0 |,
E_Control^h = sqrt(e_X² + e_Y²) / 2,
E_Control^v = ( max_{n∈[1,N]}(z_n) − min_{n∈[1,N]}(z_n) ) / 2

where e_X and e_Y are the ranges of motion in the X-axis and Y-axis directions during the hover, in the camera coordinate system with the left eye camera as origin, i.e. e_X = max_n(x_n) − min_n(x_n) and e_Y = max_n(y_n) − min_n(y_n), and z_n is the Z-axis coordinate of the unmanned aerial vehicle flight track point P_n.
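The hover-precision formulas can be sketched as follows. Illustrative only: the definitions e_X = max_n(x_n) − min_n(x_n) and e_Y = max_n(y_n) − min_n(y_n) for the motion ranges are an assumption consistent with the text, and the trajectory values are invented.

```python
import math

def hover_precision(track, H0):
    """Compute the four hover-precision deviations from a list of
    trajectory points (x_n, y_n, z_n) in the left-camera frame."""
    N = len(track)
    xs = [p[0] for p in track]
    ys = [p[1] for p in track]
    zs = [p[2] for p in track]
    # Centroid P_m of the trajectory point set.
    xm, ym, zm = sum(xs) / N, sum(ys) / N, sum(zs) / N
    # Motion ranges along X and Y (assumed definition of e_X, e_Y).
    eX = max(xs) - min(xs)
    eY = max(ys) - min(ys)
    return {
        "E_hovering_h": math.hypot(xm, ym),   # sqrt(xm^2 + ym^2)
        "E_hovering_v": abs(zm - H0),
        "E_control_h": math.hypot(eX, eY) / 2.0,
        "E_control_v": (max(zs) - min(zs)) / 2.0,
    }

# Example: four points jittering around the nominal hover height H0 = 5 m.
track = [(0.1, 0.0, 5.0), (-0.1, 0.0, 5.2), (0.0, 0.2, 4.8), (0.0, -0.2, 5.0)]
res = hover_precision(track, H0=5.0)
```

Note the split the formulas make: the E_Hovering terms measure how far the trajectory centroid sits from the commanded hover point, while the E_Control terms measure how widely the vehicle wanders around that centroid.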
The invention has the beneficial effects that:
1. The three-dimensional flight track of the unmanned aerial vehicle can be calculated in real time and the hovering precision calculated automatically, improving the accuracy and normalization of measurement.
2. The unmanned aerial vehicle does not need to be modified, and the system has good expandability.
3. The whole measuring process requires almost no human participation, so the degree of automation is high.
4. The equipment is simple and convenient to use: the flight track and hovering precision of the unmanned aerial vehicle are obtained at the computer simply by placing the binocular camera below the preset hovering point.
5. The machine-vision-based three-dimensional positioning technology integrates the processes of target detection, tracking, matching and three-dimensional positioning; all related algorithms are completed and displayed on a computer once the cameras are placed at the specified position. Thus, for the inspection detection task, an effective method is: using binocular vision technology, combined with the special conditions and requirements of unmanned aerial vehicle inspection and detection tasks, to realize target detection, tracking, accurate matching and three-dimensional positioning of the unmanned aerial vehicle.
Drawings
FIG. 1 is an overall flow chart of the present invention;
FIG. 2 is a flow chart of the calibration phase of the present invention;
FIG. 3 is a flow chart of the positioning phase of the present invention;
FIG. 4(a) is a positioning mode based on GPS onboard equipment;
FIG. 4(b) is a positioning mode based on an onboard video device;
FIG. 4(c) is a positioning mode based on an onboard inertial navigation device;
FIG. 4(d) is a positioning mode based on a ground ultrasonic range finder;
FIG. 4(e) is a positioning method based on a ground laser range finder;
FIG. 4(f) is a positioning method based on ground machine vision;
FIG. 5 shows a binocular vision-based unmanned aerial vehicle hovering precision measurement system;
FIG. 6 is a schematic view of binocular camera calibration;
FIG. 7 target tracking and matching;
FIG. 8 shows homonym matching between left and right eyes and between front and rear frames;
FIG. 9 is a schematic view of a coordinate system of the binocular positioning system;
FIG. 10 hover error diagram.
Detailed Description
The invention is further described with reference to the following figures and examples.
The hardware comprises: 2 industrial cameras, 1 camera slide rail, 1 industrial personal computer with display, and 1 chessboard calibration board. The whole system structure is shown in fig. 5.
1, 2: industrial cameras. Lens focal length 5 mm, resolution 1384×1032, frame rate 16 frames/second.
3: camera slide rail. Length 1.2 m, with scale marks at a division value of 1 mm.
4: industrial personal computer and display. ADLINK (Linghua) MXC-6000 industrial personal computer.
5: chessboard calibration board. 19×17 squares, each square 20 mm wide.
6: unmanned aerial vehicle. Unmanned helicopter or multi-rotor unmanned aerial vehicle.
As shown in fig. 1, the whole process is divided into 2 stages: a calibration stage and a positioning stage. In the calibration stage, the camera is calibrated using Zhang Zhengyou's chessboard calibration method [1]. In the positioning stage, the three-dimensional position of the unmanned aerial vehicle target is calculated from the calibration parameters.
3.2 calibration stage:
the theoretical problem of the calibration technology related algorithm is basically solved. The current camera calibration mainly comprises three methods: a conventional calibration method, an active vision calibration method, and a self-calibration method. The traditional calibration method utilizes a known calibration template of geometric constraint to calculate the camera parameters, has simple and convenient equipment and high precision, and is the most common method at present. The active visual calibration method fixes the camera on mechanical mechanisms such as a tripod head and the like, strictly limits the rotation and translation motion of the camera, and has high calibration precision, complex equipment and long calibration time; the self-calibration method only calculates the camera parameters by depending on the corresponding relation among a plurality of images in a scene, is flexible and convenient to calibrate, belongs to nonlinear calibration, and is low in robustness.
The present invention uses a conventional calibration method, extending the approach of document [1]. As shown in fig. 2 and 6, the calibration steps are as follows:
1. Fix the two cameras on the same slide rail, define a distance L, and adjust the positions of the two cameras on the slide rail so that the distance between their center points is L.
2. Calibrate the cameras with Zhang Zhengyou's chessboard calibration method, and record the calibration result parameter:
result = {M_left, D_left, M_right, D_right, R, T}
where result represents the calibration result, M_left and D_left are the camera matrix and distortion coefficient vector of the left eye camera, M_right and D_right are the camera matrix and distortion coefficient vector of the right eye camera, and R and T are the rotation matrix and translation vector between the two cameras.
3.3 positioning stage:
As shown in fig. 5, when unmanned aerial vehicle hovering precision is measured, the binocular cameras are fixed on the slide rail in parallel, with the two cameras' imaging planes in the same plane as far as possible and their optical axes parallel to each other. The slide rail is placed directly under the unmanned aerial vehicle hovering point, with the camera lenses placed vertically upwards. The left eye camera and the right eye camera collect images of the unmanned aerial vehicle and transmit them to a portable computer over a GigE gigabit network. The portable computer locates the three-dimensional position coordinates of the unmanned aerial vehicle from the left eye and right eye images using the relevant algorithms. After hovering, the hovering precision is calculated from the three-dimensional track of the unmanned aerial vehicle.
As shown in fig. 3, the three-dimensional position positioning algorithm of the drone is mainly divided into the following steps: positioning and tracking a target area, matching left and right target objects, matching homonymous points, calculating three-dimensional coordinates and evaluating hovering precision.
Target area location and tracking
The target area can be positioned by manual selection or automatic detection. Because the background of the unmanned aerial vehicle target image is a static single background, a salient target detection mode can be adopted to obtain the target area. As shown in fig. 7, at time t = 0, after the target position (solid-line rectangular frame) in the left eye image is obtained by manual selection or automatic detection, the TLD (Tracking-Learning-Detection) algorithm [2] is used for tracking. The advantage of the TLD algorithm is that it can still detect and track a target that moves out of and re-enters the image area.
Left and right eye target matching
Assume the target region of the left eye image is determined for the first time at time t = 0; as shown in fig. 7, the target region is a rectangular area with upper-left corner point B_L, height h and width w. Left and right target matching searches the right eye image for the matching area most similar to the left eye target area; the matching area is a rectangular area with upper-left corner point B_R, height h and width w. The target match can then be expressed as the following problem:

min_{x_R, y_R} Σ_{i=0}^{h} Σ_{j=0}^{w} | I_left[x_L + i][y_L + j] − I_right[x_R + i][y_R + j] |    formula (1)

where I_left and I_right are the gray values of the left and right eye images respectively, and (x_L, y_L) and (x_R, y_R) are the coordinates of points B_L and B_R respectively. At this time the search range is x_R ∈ [0, x_L], y_R ∈ [y_L − s_h/2, y_L + s_h/2], shown by the gray area in the t = 0 image of fig. 7. The (x_R, y_R) minimizing formula (1) is obtained, and the parallax is d = x_L − x_R.

At time t + 1, the left eye image yields a new target area through the TLD algorithm, and the search range in the right eye image is updated to x_R ∈ [x_L − d − s_w/2, x_L − d + s_w/2], y_R ∈ [y_L − s_h/2, y_L + s_h/2], shown by the gray area in the t + 1 image. The target region in the right eye image is determined according to formula (1), and so on for the target areas of the left and right eye images of each frame.
Matching points of same name
The homonymous points are the pixel points corresponding to the same part of the actual target in the left eye and right eye images. As shown in fig. 8, it must be ensured that the homonymous points in the front and rear frames and in the left and right eye images correspond to the same position of the actual target. A simple way is to use the center point of the target area in each image, i.e. P_L = (x_L + w/2, y_L + h/2) and P_R = (x_R + w/2, y_R + h/2), as the homonymous points of the left and right eye targets.
Three-dimensional coordinate calculation
The camera coordinate system takes the optical center O_L of the left eye camera as origin, with the XO_LY plane parallel to the imaging plane and the optical axis as the Z axis, as shown in fig. 9. From the calibrated camera parameters, the reprojection matrix is obtained:

Q = [ 1 0 0 −c_x^l ; 0 1 0 −c_y^l ; 0 0 0 f^l ; 0 0 −1/T_x (c_x^l − c_x^r)/T_x ]    formula (2)

where (c_x^l, c_y^l) and (c_x^r, c_y^r) are the principal point coordinates of the left and right cameras (c_y^r is not used in the formula); T_x is the X-axis component of the translation matrix between the two cameras; f^l is the focal length of the left camera. With the left and right eye optical axes parallel to each other, the homonymous point coordinates P_L(x_L, y_L) and P_R(x_R, y_R) of the left and right eye images are known, and the parallax of the target point in the left and right views is d = x_L − x_R. Then let

[ x̂_c ŷ_c ẑ_c ŵ_c ]^T = Q [ x_L y_L d 1 ]^T = [ x_L − c_x^l , y_L − c_y^l , f^l , (−d + c_x^l − c_x^r)/T_x ]^T    formula (3)

This yields the three-dimensional coordinates of the target point in the camera coordinate system:

P_c = (x_c, y_c, z_c) = (x̂_c/ŵ_c , ŷ_c/ŵ_c , ẑ_c/ŵ_c).    formula (4)
Hover precision assessment
When the unmanned aerial vehicle hovers, the three-dimensional position coordinates of the unmanned aerial vehicle are calculated in real time by the above method, yielding the flight track of the unmanned aerial vehicle. Suppose the set of obtained track points is P = [P_1, P_2, ..., P_N], N points in total, where
P_n = [x_n, y_n, z_n]^T, n = 1, 2, ..., N. The points in fig. 10 are the projections of the flight trajectory points onto the horizontal plane. The centroid of the set of flight trajectory points is P_m = [x_m, y_m, z_m]^T = (1/N) Σ_{n=1}^{N} P_n. When hovering precision is measured, the binocular camera is placed directly below the hover; specifically, the left eye camera is placed at the hovering point with its optical axis perpendicular to the horizontal plane. Hovering precision is divided into the horizontal deviation E_Hovering^h and vertical deviation E_Hovering^v of fixed-point hovering precision, and the horizontal deviation E_Control^h and vertical deviation E_Control^v of hover control accuracy. Since O_L is the origin of the coordinate system, the formulas are:

E_Hovering^h = sqrt(x_m² + y_m²),
E_Hovering^v = | z_m − H_0 |,
E_Control^h = sqrt(e_X² + e_Y²) / 2,
E_Control^v = ( max_{n∈[1,N]}(z_n) − min_{n∈[1,N]}(z_n) ) / 2.
reference documents:
[1] Zhang Z. A flexible new technique for camera calibration [J]. Pattern Analysis and Machine Intelligence, IEEE Transactions on, 2000, 22(11): 1330-1334.
[2] Kalal Z, Mikolajczyk K, Matas J. Tracking-learning-detection [J]. Pattern Analysis and Machine Intelligence, IEEE Transactions on, 2012, 34(7): 1409-1422.
Although the embodiments of the present invention have been described with reference to the accompanying drawings, they do not limit the scope of the present invention; it should be understood that various modifications and variations can be made by those skilled in the art, without inventive effort, on the basis of the technical solution of the present invention.

Claims (9)

1. The unmanned aerial vehicle hovering precision measurement method based on binocular vision is characterized by comprising the following steps:
step (1): a calibration stage: calibrating the cameras with Zhang Zhengyou's chessboard calibration method, thereby determining the calibration parameters and defining the calibration result parameters;
step (2): a positioning stage: when the hovering precision of the unmanned aerial vehicle is measured, a slide rail is placed under the hovering point of the unmanned aerial vehicle; the binocular cameras are fixed on the slide rail in parallel at a set distance and can move along the slide rail; the camera lenses are placed vertically upwards, the imaging planes of the binocular cameras lie in the same plane, and the optical axes are parallel to each other; the left eye camera and the right eye camera respectively collect images of the unmanned aerial vehicle and transmit them to the computer; the computer calculates the three-dimensional position coordinates of the unmanned aerial vehicle from the collected left eye and right eye images combined with the calibration result parameters obtained in step (1); and after hovering, the hovering precision is calculated from the three-dimensional track of the unmanned aerial vehicle.
2. The binocular vision based unmanned aerial vehicle hovering precision measuring method according to claim 1, wherein the step (1) is:
step (1-1): fixing two cameras on the same slide rail, defining a distance L, and adjusting the positions of the two cameras on the slide rail to enable the distance between the center points of the two cameras to be L;
step (1-2): calibrating the camera by adopting Zhangzhen chessboard calibration method, and recording calibration result parameter result ═ Mleft,Dleft,Mright,DrightR, T }; result represents the calibration result, MleftAnd DleftRepresenting the camera matrix and distortion coefficient vector, M, of the eye camera, respectivelyrightAnd DrightRespectively representing a camera matrix and a distortion coefficient vector of a right-eye camera, and R and T respectively representing a rotation matrix and a translation vector between the two cameras; for each of the cameras, the camera is,
M = \begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix}.
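In pinhole-model terms, M maps camera-frame coordinates to pixel coordinates. A minimal NumPy sketch of building M and projecting a point follows; the focal lengths and principal point values are illustrative, not taken from the patent:

```python
import numpy as np

def make_camera_matrix(fx, fy, cx, cy):
    """Build the 3x3 intrinsic matrix M of the pinhole camera model."""
    return np.array([[fx, 0.0, cx],
                     [0.0, fy, cy],
                     [0.0, 0.0, 1.0]])

def project(M, point_cam):
    """Project a 3D point in the camera frame to pixel coordinates."""
    x, y, z = point_cam
    u, v, w = M @ np.array([x, y, z])
    return u / w, v / w   # perspective division by the depth

M = make_camera_matrix(fx=800.0, fy=800.0, cx=320.0, cy=240.0)
u, v = project(M, (0.1, -0.05, 2.0))   # a point 2 m in front of the camera
# u == 360.0, v == 220.0
```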
3. The binocular vision based unmanned aerial vehicle hovering precision measuring method according to claim 1, wherein the step (2) is:
step (2-1): when the hovering precision of the unmanned aerial vehicle is measured, the slide rail is placed directly below the hovering point of the unmanned aerial vehicle; the binocular cameras are fixed in parallel on the slide rail, the camera lenses point vertically upwards, the imaging planes of the binocular cameras lie in the same plane, and the optical axes are parallel to each other; the left-eye camera acquires a left-eye image of the unmanned aerial vehicle, and the right-eye camera acquires a right-eye image of the unmanned aerial vehicle;
step (2-2): target area detection: acquiring the target area in the left-eye image by salient target detection;
step (2-3): target tracking: tracking the target in the left-eye image by using the TLD (Tracking-Learning-Detection) algorithm;
step (2-4): target matching: searching the right-eye image for the matching area most similar to the target area of the left-eye image;
step (2-5): homonymous point matching: respectively using the central points of the rectangular target areas in the left-eye image and the right-eye image as homonymous points;
step (2-6): three-dimensional coordinate calculation: establishing the camera coordinate system, and calculating the three-dimensional coordinates of the target point in the camera coordinate system by combining the calibration result parameters obtained in step (1);
step (2-7): hovering precision evaluation: calculating the hovering precision of the unmanned aerial vehicle according to the three-dimensional coordinate trajectory of the target point.
4. The binocular vision based unmanned aerial vehicle hovering precision measuring method according to claim 3, wherein the step (2-2) is: setting the time of starting to position the target as t = 0, and first acquiring the target area by salient target detection, wherein the target area is a rectangular area with B_L as its upper-left corner point, height h and width w.
5. The binocular vision based unmanned aerial vehicle hovering precision measuring method according to claim 3, wherein the step (2-3) is: at the times t = 1 and later, tracking the target in the left-eye image by the TLD algorithm, starting from the target area of the left-eye image determined at time t = 0.
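TLD itself combines tracking, learning and detection components; as an illustration of the tracking idea only, the sketch below relocates the previous target template in a new frame by an exhaustive sum-of-absolute-differences search around the last known position. This is a deliberate simplification, not the TLD algorithm:

```python
import numpy as np

def track_sad(frame, template, prev_xy, search=10):
    """Locate `template` in `frame` near prev_xy = (x, y) by minimising
    the sum of absolute grey-level differences (SAD)."""
    h, w = template.shape
    x0, y0 = prev_xy
    best, best_xy = None, prev_xy
    for y in range(max(0, y0 - search), min(frame.shape[0] - h, y0 + search) + 1):
        for x in range(max(0, x0 - search), min(frame.shape[1] - w, x0 + search) + 1):
            cost = np.abs(frame[y:y+h, x:x+w].astype(int) - template.astype(int)).sum()
            if best is None or cost < best:
                best, best_xy = cost, (x, y)
    return best_xy

# synthetic example: a bright 5x5 square moves 3 px right and 2 px down
f0 = np.zeros((40, 40), np.uint8); f0[10:15, 10:15] = 255
f1 = np.zeros((40, 40), np.uint8); f1[12:17, 13:18] = 255
tpl = f0[10:15, 10:15]
pos = track_sad(f1, tpl, (10, 10))   # -> (13, 12)
```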
6. The binocular vision based unmanned aerial vehicle hovering precision measuring method according to claim 3, wherein the step (2-4) is: each time the target area in the left-eye image is obtained, searching the right-eye image for the matching area most similar to the target area of the left-eye image, wherein the matching area is a rectangular area with B_R as its upper-left corner point, height h and width w;
the target match is then expressed as:
\min_{x_R, y_R} \sum_{i=0}^{h} \sum_{j=0}^{w} \left| I_{\mathrm{left}}[x_L+i][y_L+j] - I_{\mathrm{right}}[x_R+i][y_R+j] \right| \quad (1)
wherein I_left represents the gray values of the left-eye image, I_right represents the gray values of the right-eye image, (x_L, y_L) represents the coordinates of point B_L, and (x_R, y_R) represents the coordinates of point B_R; at this time the search range is x_R ∈ [0, x_L], y_R ∈ [y_L − s_h/2, y_L + s_h/2], wherein s_h is the height of the search area; the B_R point coordinates (x_R, y_R) that minimize equation (1) are obtained, and then the parallax
d=xL-xR
at the times t ≥ 1, after the left-eye image obtains a new target area through the TLD algorithm, the search range of the right-eye image is updated to x_R ∈ [x_L − d − s_w/2, x_L − d + s_w/2], y_R ∈ [y_L − s_h/2, y_L + s_h/2], wherein s_w is the width of the search area, and the target area in the right-eye image is determined according to equation (1); in the same way, the target area in the left-eye image of each frame and the corresponding area of the same target in the right-eye image are calculated.
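The left-to-right matching of equation (1) with a restricted search window can be sketched as follows. The images are synthetic, and the search-band height `s_h` and the box values are illustrative:

```python
import numpy as np

def match_right(left, right, box, s_h=6):
    """Find the right-eye area most similar to the left-eye target area
    `box` = (x_L, y_L, w, h) in the spirit of equation (1): minimise the
    sum of absolute grey-level differences over x_R in [0, x_L] and a
    vertical band of height s_h around y_L.  Returns the best (x_R, y_R)
    and the parallax d = x_L - x_R."""
    x_L, y_L, w, h = box
    tpl = left[y_L:y_L+h, x_L:x_L+w].astype(int)
    best, best_xy = None, (x_L, y_L)
    for y_R in range(max(0, y_L - s_h // 2), min(right.shape[0] - h, y_L + s_h // 2) + 1):
        for x_R in range(0, x_L + 1):
            cost = np.abs(right[y_R:y_R+h, x_R:x_R+w].astype(int) - tpl).sum()
            if best is None or cost < best:
                best, best_xy = cost, (x_R, y_R)
    x_R, y_R = best_xy
    return best_xy, x_L - x_R

# synthetic stereo pair: the target sits 8 px further left in the right image
L_img = np.zeros((30, 60), np.uint8); L_img[10:16, 30:38] = 200
R_img = np.zeros((30, 60), np.uint8); R_img[10:16, 22:30] = 200
xy, d = match_right(L_img, R_img, (30, 10, 8, 6))
# xy == (22, 10), d == 8
```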
7. The binocular vision based unmanned aerial vehicle hovering precision measuring method according to claim 3, wherein the step (2-5) is:
using the central point (x_L + w/2, y_L + h/2) of the target area in the left-eye image as the homonymous point of the left-eye target, and the central point (x_R + w/2, y_R + h/2) of the target area in the right-eye image as the homonymous point of the right-eye target.
8. The binocular vision based unmanned aerial vehicle hovering precision measuring method according to claim 3, wherein the step (2-6) is:
the camera coordinate system takes the optical center O_L of the left-eye camera as the origin, the X O_L Y plane is parallel to the imaging plane, and the optical axis direction is the Z axis; a reprojection matrix is obtained from the calibrated camera parameters:
Q = \begin{bmatrix} 1 & 0 & 0 & -c_x^l \\ 0 & 1 & 0 & -c_y^l \\ 0 & 0 & 0 & f^l \\ 0 & 0 & -1/T_x & (c_x^l - c_x^r)/T_x \end{bmatrix} \quad (2)
wherein (c_x^l, c_y^l) are the principal point coordinates of the left camera and (c_x^r, c_y^r) are the principal point coordinates of the right camera; T_x is the X-axis component of the translation vector between the two cameras; f^l is the focal length of the left camera;
under the condition that the left-eye and right-eye optical axes are parallel to each other, with the homonymous point coordinates P_L(x_L, y_L) of the left-eye image and the homonymous point coordinates P_R(x_R, y_R) of the right-eye image known, the parallax d = x_L − x_R of the target point in the left and right views is calculated; then let
\begin{bmatrix} \hat{x}_c \\ \hat{y}_c \\ \hat{z}_c \\ \hat{w}_c \end{bmatrix} = Q \begin{bmatrix} x_L \\ y_L \\ d \\ 1 \end{bmatrix} = \begin{bmatrix} x_L - c_x^l \\ y_L - c_y^l \\ f^l \\ \left(-d + c_x^l - c_x^r\right)/T_x \end{bmatrix} \quad (3)
Obtaining the three-dimensional coordinates of the target point in a camera coordinate system:
P_c = (x_c, y_c, z_c) = \left( \hat{x}_c/\hat{w}_c,\; \hat{y}_c/\hat{w}_c,\; \hat{z}_c/\hat{w}_c \right) \quad (4).
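Equations (2)-(4) can be exercised with a small NumPy sketch. The focal length, principal points and baseline below are illustrative, and the negative `Tx` follows OpenCV's sign convention for the left-to-right translation, which is an assumption since the patent does not fix the sign:

```python
import numpy as np

def reproject(x_L, y_L, d, cxl, cyl, cxr, f_l, Tx):
    """Recover the 3D point in the left-camera frame from the left-image
    homonymous point (x_L, y_L) and the parallax d, via the reprojection
    matrix Q of equations (2)-(4)."""
    Q = np.array([[1.0, 0.0, 0.0, -cxl],
                  [0.0, 1.0, 0.0, -cyl],
                  [0.0, 0.0, 0.0,  f_l],
                  [0.0, 0.0, -1.0 / Tx, (cxl - cxr) / Tx]])
    xh, yh, zh, wh = Q @ np.array([x_L, y_L, d, 1.0])
    return xh / wh, yh / wh, zh / wh   # P_c = (x_c, y_c, z_c)

# illustrative numbers: f = 800 px, identical principal points,
# baseline 0.2 m (so Tx = -0.2 under the assumed convention)
x, y, z = reproject(x_L=360.0, y_L=240.0, d=40.0,
                    cxl=320.0, cyl=240.0, cxr=320.0, f_l=800.0, Tx=-0.2)
# z == f * baseline / d = 800 * 0.2 / 40 = 4.0 m
```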
9. The binocular vision based unmanned aerial vehicle hovering precision measuring method according to claim 3, wherein the step (2-7) is:
when the unmanned aerial vehicle hovers, the three-dimensional position coordinates of the unmanned aerial vehicle are calculated in real time by the steps (2-2) to (2-6), obtaining the flight trajectory of the unmanned aerial vehicle;
suppose that the set of obtained trajectory points is P = [P_1, P_2, ..., P_N], a total of N points, wherein P_n = [x_n, y_n, z_n]^T, n = 1, 2, ..., N;
the centroid of the set of flight trajectory points is P_m = [x_m, y_m, z_m]^T = \frac{1}{N}\sum_{n=1}^{N} P_n;
when the hovering precision of the unmanned aerial vehicle is tested, the specified hovering height of the unmanned aerial vehicle above the ground is recorded as H_0;
when hovering precision detection is carried out, the binocular cameras are placed directly below the hovering point; specifically, the left-eye camera is placed at the hovering point with its optical axis perpendicular to the horizontal plane; the hovering precision is divided into the horizontal deviation E_Hovering^h and vertical deviation E_Hovering^v of the fixed-point hovering precision, and the horizontal deviation E_Control^h and vertical deviation E_Control^v of the hovering control precision;
since O_L is the origin of the coordinate system, the calculation formulas are as follows:
E_{\mathrm{Hovering}}^{h} = \sqrt{x_m^2 + y_m^2},
E_{\mathrm{Hovering}}^{v} = \left| z_m - H_0 \right|,
E_{\mathrm{Control}}^{h} = \frac{\sqrt{e_X^2 + e_Y^2}}{2},
E_{\mathrm{Control}}^{v} = \frac{\max_{n \in [1,N]}(z_n) - \min_{n \in [1,N]}(z_n)}{2}
wherein e_X and e_Y respectively represent the motion ranges in the X-axis and Y-axis directions during the hovering process of the unmanned aerial vehicle, in the camera coordinate system with the left-eye camera at the origin; z_n represents the Z-axis coordinate of the unmanned aerial vehicle flight trajectory point P_n.
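The four precision figures can be computed directly from the recorded trajectory; a NumPy sketch with a toy trajectory follows (all numbers illustrative):

```python
import numpy as np

def hover_precision(P, H0):
    """Compute the four hovering-precision figures of the claims from the
    trajectory points P (N x 3 array, camera frame with the left camera
    at the origin) and the specified hover height H0."""
    Pm = P.mean(axis=0)                      # centroid [x_m, y_m, z_m]
    eX = P[:, 0].max() - P[:, 0].min()       # X-axis motion range
    eY = P[:, 1].max() - P[:, 1].min()       # Y-axis motion range
    E_hover_h = np.hypot(Pm[0], Pm[1])       # sqrt(x_m^2 + y_m^2)
    E_hover_v = abs(Pm[2] - H0)              # |z_m - H0|
    E_ctrl_h = np.hypot(eX, eY) / 2
    E_ctrl_v = (P[:, 2].max() - P[:, 2].min()) / 2
    return E_hover_h, E_hover_v, E_ctrl_h, E_ctrl_v

# toy trajectory hovering around (0.3, 0.4, 9.9) m, specified height 10 m
P = np.array([[0.2, 0.4, 9.8],
              [0.4, 0.4, 10.0],
              [0.3, 0.3, 9.9],
              [0.3, 0.5, 9.9]])
Eh, Ev, Ch, Cv = hover_precision(P, H0=10.0)
# Eh == 0.5 (centroid offset), Ev == 0.1, Cv == 0.1
```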
CN201510736167.9A 2015-11-02 2015-11-02 Unmanned plane hovering accuracy measurement method based on binocular vision Active CN105424006B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510736167.9A CN105424006B (en) 2015-11-02 2015-11-02 Unmanned plane hovering accuracy measurement method based on binocular vision

Publications (2)

Publication Number Publication Date
CN105424006A true CN105424006A (en) 2016-03-23
CN105424006B CN105424006B (en) 2017-11-24

Family

ID=55502384

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510736167.9A Active CN105424006B (en) 2015-11-02 2015-11-02 Unmanned plane hovering accuracy measurement method based on binocular vision

Country Status (1)

Country Link
CN (1) CN105424006B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101489149A (en) * 2008-12-25 2009-07-22 清华大学 Binocular tri-dimensional video collecting system
CN101876532A (en) * 2010-05-25 2010-11-03 大连理工大学 Camera on-field calibration method in measuring system
CN102967305A (en) * 2012-10-26 2013-03-13 南京信息工程大学 Multi-rotor unmanned aerial vehicle pose acquisition method based on markers in shape of large and small square
US20130070961A1 (en) * 2010-03-23 2013-03-21 Omid E. Kia System and Method for Providing Temporal-Spatial Registration of Images
CN104006803A (en) * 2014-06-20 2014-08-27 中国人民解放军国防科学技术大学 Camera shooting measurement method for rotation motion parameters of spinning stability spacecraft
WO2015105756A1 (en) * 2014-01-10 2015-07-16 Microsoft Technology Licensing, Llc Increasing touch and/or hover accuracy on a touch-enabled device
CN104932523A (en) * 2015-05-27 2015-09-23 深圳市高巨创新科技开发有限公司 Positioning method and apparatus for unmanned aerial vehicle

Cited By (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107036625A (en) * 2016-02-02 2017-08-11 中国电力科学研究院 A kind of flying quality detection method of power transmission line unmanned helicopter patrol inspection system
CN105957109A (en) * 2016-04-29 2016-09-21 北京博瑞爱飞科技发展有限公司 Target tracking method and device
CN106020218A (en) * 2016-05-16 2016-10-12 国家电网公司 UAV (unmanned aerial vehicle) hovering precision test method and system
CN106020218B (en) * 2016-05-16 2018-11-13 国家电网公司 A kind of the hovering method for testing precision and system of unmanned plane
CN106153008B (en) * 2016-06-17 2018-04-06 北京理工大学 A kind of rotor wing unmanned aerial vehicle objective localization method of view-based access control model
CN106153008A (en) * 2016-06-17 2016-11-23 北京理工大学 A kind of rotor wing unmanned aerial vehicle objective localization method of view-based access control model
CN107300377A (en) * 2016-11-01 2017-10-27 北京理工大学 A kind of rotor wing unmanned aerial vehicle objective localization method under track of being diversion
CN107300377B (en) * 2016-11-01 2019-06-14 北京理工大学 A kind of rotor wing unmanned aerial vehicle objective localization method under track of being diversion
CN106709955A (en) * 2016-12-28 2017-05-24 天津众阳科技有限公司 Space coordinate system calibrate system and method based on binocular stereo visual sense
CN106709955B (en) * 2016-12-28 2020-07-24 天津众阳科技有限公司 Space coordinate system calibration system and method based on binocular stereo vision
CN108965651A (en) * 2017-05-19 2018-12-07 深圳市道通智能航空技术有限公司 A kind of drone height measurement method and unmanned plane
CN107424156A (en) * 2017-06-28 2017-12-01 北京航空航天大学 Unmanned plane autonomous formation based on Fang Cang Owl eye vision attentions accurately measures method
CN107424156B (en) * 2017-06-28 2019-12-06 北京航空航天大学 Unmanned aerial vehicle autonomous formation accurate measurement method based on visual attention of barn owl eyes
CN109211185A (en) * 2017-06-30 2019-01-15 北京臻迪科技股份有限公司 A kind of flight equipment, the method and device for obtaining location information
CN107490375A (en) * 2017-09-21 2017-12-19 重庆鲁班机器人技术研究院有限公司 Spot hover accuracy measuring device, method and unmanned vehicle
CN108489454A (en) * 2018-03-22 2018-09-04 沈阳上博智像科技有限公司 Depth distance measurement method, device, computer readable storage medium and electronic equipment
CN109211573A (en) * 2018-09-12 2019-01-15 北京工业大学 A kind of evaluating method of unmanned plane hoverning stability
CN109360240A (en) * 2018-09-18 2019-02-19 华南理工大学 A kind of small drone localization method based on binocular vision
CN109360240B (en) * 2018-09-18 2022-04-22 华南理工大学 Small unmanned aerial vehicle positioning method based on binocular vision
CN109855822B (en) * 2019-01-14 2019-12-06 中山大学 unmanned aerial vehicle-based high-speed rail bridge vertical dynamic disturbance degree measuring method
CN109855822A (en) * 2019-01-14 2019-06-07 中山大学 A kind of high-speed rail bridge based on unmanned plane vertically moves degree of disturbing measurement method
CN109813509B (en) * 2019-01-14 2020-01-24 中山大学 Method for realizing measurement of vertical dynamic disturbance degree of high-speed rail bridge based on unmanned aerial vehicle
CN109813509A (en) * 2019-01-14 2019-05-28 中山大学 The method that high-speed rail bridge vertically moves degree of disturbing measurement is realized based on unmanned plane
CN110986891A (en) * 2019-12-06 2020-04-10 西北农林科技大学 System for accurately and rapidly measuring crown width of tree by using unmanned aerial vehicle
CN111688949A (en) * 2020-06-24 2020-09-22 天津大学 Unmanned aerial vehicle hovering attitude measurement device and method
CN112188112A (en) * 2020-09-28 2021-01-05 苏州臻迪智能科技有限公司 Light supplement control method, light supplement control device, storage medium and electronic equipment
CN112365526A (en) * 2020-11-30 2021-02-12 湖南傲英创视信息科技有限公司 Binocular detection method and system for weak and small targets
CN112365526B (en) * 2020-11-30 2023-08-25 湖南傲英创视信息科技有限公司 Binocular detection method and system for weak and small targets
CN114818546A (en) * 2022-05-24 2022-07-29 重庆大学 Unmanned aerial vehicle hovering wind resistance performance two-dimensional evaluation method based on error sorting
CN114877876A (en) * 2022-07-12 2022-08-09 南京市计量监督检测院 Unmanned aerial vehicle hovering precision evaluation method

Also Published As

Publication number Publication date
CN105424006B (en) 2017-11-24

Similar Documents

Publication Publication Date Title
CN105424006B (en) Unmanned plane hovering accuracy measurement method based on binocular vision
CN102788559B (en) Optical vision measuring system with wide-field structure and measuring method thereof
EP3158731B1 (en) System and method for adjusting a baseline of an imaging system with microlens array
CN102519434B (en) Test verification method for measuring precision of stereoscopic vision three-dimensional recovery data
CN109859272A (en) A kind of auto-focusing binocular camera scaling method and device
CN111192235A (en) Image measuring method based on monocular vision model and perspective transformation
CN108413917B (en) Non-contact three-dimensional measurement system, non-contact three-dimensional measurement method and measurement device
CN109269525B (en) Optical measurement system and method for take-off or landing process of space probe
CN106408601A (en) GPS-based binocular fusion positioning method and device
WO2011125937A1 (en) Calibration data selection device, method of selection, selection program, and three dimensional position measuring device
CN110136047B (en) Method for acquiring three-dimensional information of static target in vehicle-mounted monocular image
JP2016057063A (en) Non-contact detecting method for measurement objects, and apparatus for the same
CN104729484A (en) Multi-view stereo aerial photographic device for unmanned aerial vehicles and method for determining focal length of multi-view stereo aerial photographic device
CN112129263B (en) Distance measurement method of separated mobile stereo distance measurement camera
CN114812558B (en) Monocular vision unmanned aerial vehicle autonomous positioning method combining laser ranging
Sobel et al. Camera calibration for tracked vehicles augmented reality applications
CN109493378B (en) Verticality detection method based on combination of monocular vision and binocular vision
CN114018291A (en) Calibration method and device for parameters of inertial measurement unit
CN109712200B (en) Binocular positioning method and system based on least square principle and side length reckoning
CN113504385B (en) Speed measuring method and device for plural cameras
CN115049784A (en) Three-dimensional velocity field reconstruction method based on binocular particle image
CN111412898B (en) Large-area deformation photogrammetry method based on ground-air coupling
US11514597B1 (en) Single-camera stereoaerophotogrammetry using UAV sensors
Mulsow et al. A universal approach for geometric modelling in underwater stereo image processing
CN114663486A (en) Building height measurement method and system based on binocular vision

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: Wang Yue Central Road Ji'nan City, Shandong province 250002 City No. 2000

Co-patentee after: National Network Intelligent Technology Co., Ltd.

Patentee after: Electric Power Research Institute of State Grid Shandong Electric Power Company

Co-patentee after: State Grid Corporation of China

Address before: Wang Yue Central Road Ji'nan City, Shandong province 250002 City No. 2000

Co-patentee before: Shandong Luneng Intelligent Technology Co., Ltd.

Patentee before: Electric Power Research Institute of State Grid Shandong Electric Power Company

Co-patentee before: State Grid Corporation of China

CP01 Change in the name or title of a patent holder
TR01 Transfer of patent right

Effective date of registration: 20201029

Address after: 250101 Electric Power Intelligent Robot Production Project 101 in Jinan City, Shandong Province, South of Feiyue Avenue and East of No. 26 Road (ICT Industrial Park)

Patentee after: National Network Intelligent Technology Co.,Ltd.

Address before: Wang Yue Central Road Ji'nan City, Shandong province 250002 City No. 2000

Patentee before: ELECTRIC POWER RESEARCH INSTITUTE OF STATE GRID SHANDONG ELECTRIC POWER Co.

Patentee before: National Network Intelligent Technology Co.,Ltd.

Patentee before: STATE GRID CORPORATION OF CHINA

TR01 Transfer of patent right