CN107065895A - An altitude-determination method for a plant protection unmanned aerial vehicle (UAV)
- Publication number: CN107065895A (application CN201710006131.4A)
- Authority: CN (China)
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G05D1/042 — Control of altitude or depth specially adapted for aircraft
- G06T2207/10016 — Video; image sequence
- G06T2207/20021 — Dividing image into blocks, subimages or windows
- G06T2207/30181 — Earth observation
Abstract
The invention discloses an altitude-determination method for a plant protection UAV, comprising the following steps. Step 1: capture a forward-looking picture sequence with the camera mounted on the UAV. Step 2: extract feature points from the captured pictures. Step 3: match the extracted feature points using the SAD (sum of absolute differences) algorithm. Step 4: cross positioning — from the set of matched corresponding points and the position of the aircraft, its attitude, and the camera angles, compute the three-dimensional coordinates of each corresponding point. Step 5: building on this positioning method, measure each corresponding target point multiple times and establish a multi-station positioning model based on weighted least squares. The disclosed technique has good anti-interference capability, and the equipment is simple and inexpensive: mounting a single imaging device on an ordinary plant protection UAV is enough to hold the UAV at a fixed height, so the method is easy to deploy and has broad application prospects.
Description
Technical field
The invention belongs to the field of agricultural technology, and in particular relates to an altitude-determination method for plant protection unmanned aerial vehicles (UAVs).
Background technology
In recent years, agricultural plant protection UAVs have come into wide use. For areas such as paddy fields, hills, and mountainous terrain, where ground spraying machinery and fixed-wing manned aircraft cannot operate effectively, plant protection UAVs have unique advantages. During operation, the UAV must maintain a relative altitude of 1–2 meters above the crops at all times to guarantee uniform and efficient spraying of pesticide. In actual operation, however, uneven farmland is frequently encountered, and factors such as weather and lighting also strongly affect the UAV's altitude determination; a sound altitude-determination scheme is therefore particularly important.
Current altitude-holding schemes mainly rely on GPS, barometers, ultrasonic ranging, and laser ranging. GPS yields height relative to sea level, with large error and a slow refresh rate. A barometer yields absolute altitude rather than the distance between the aircraft and the crops, and external factors such as air pressure and wind make its height measurement inaccurate. Ultrasonic height measurement easily penetrates vegetation, is strongly affected by temperature and pressure, and has relatively poor anti-interference capability. Laser ranging is accurate and interference-resistant, but the equipment is expensive and unsuitable for wide adoption. These traditional altitude-holding schemes all have significant limitations; the consequences are missed or repeated spraying during operation, degraded spray quality, and even crashes caused by loss of altitude. It is therefore necessary to propose a reasonable, widely applicable altitude-determination scheme for plant protection UAVs.
Summary of the invention
In view of the problems in the prior art, the invention discloses an altitude-determination method for plant protection UAVs. The method is based on stereo vision: stereo vision can reconstruct the depth information of the imaged region, and thus provides the theoretical basis and the means for vision-based UAV altitude holding. The invention mounts a monocular camera on the UAV and, considering both data accuracy and computational real-time performance, selects quasi-dense matching for three-dimensional reconstruction, thereby ensuring accurate altitude determination for the plant protection UAV.
The invention is realized as follows: an altitude-determination method for a plant protection UAV, characterized by the following steps.
Step 1: shoot the ground scene ahead with the camera mounted on the UAV, and save the picture sequence at constant time intervals. This arrangement is chosen because acquiring and transmitting images and computing on them all take time; the forward-looking camera strategy ensures that the altitude data are updated in time.
Step 2: extract feature points from the current captured frame.
Step 3: extract n frames backwards from the saved picture sequence, and match the extracted feature points using the SAD algorithm.
Step 4: if n = 2, use cross positioning: from the set of matched corresponding points, the position and attitude of the aircraft, and the camera angles, compute the three-dimensional coordinates of each corresponding point.
Step 5: if n > 2, use multi-station positioning based on weighted least squares to compute the three-dimensional coordinates of each corresponding point.
Further, step 2 proceeds as follows:
2.1: apply the SLIC image segmentation algorithm to perform superpixel segmentation of the k-th frame, and compute the centroid coordinates of each region block, obtaining point set I1;
2.2: extract the corner coordinates of the k-th frame with the Harris corner detector, obtaining point set I2; merge I1 and I2 into the feature point set Ip to be matched.
Further, step 3 proceeds as follows:
3.1: around each feature point, select an N × N region as the template;
3.2: search for the match of this template in the (k−1)-th frame, with the search region restricted to an (N_search × N_search) neighborhood of the feature point; the matching similarity measure is

SAD(dx, dy) = Σ | template(x, y) − image_{k−1}(x + dx, y + dy) |

where (x, y) are the coordinates of points in the template and (dx, dy) is the relative coordinate offset;
3.3: shift the position of the search-window center to minimize the similarity measure; the point at which the similarity measure reaches its minimum is the match point.
Further, step 4 is cross positioning, specifically as follows.
Suppose the ground target is photographed from two aerial points C1 and C2, and the image points of ground target point P in the left and right photographs are p1 and p2. The corresponding rays C1p1 and C2p2 clearly intersect at the ground target point P.
From the perspective-projection imaging relation, the collinearity equations of the C1 and C2 images are derived as:

xi = Fx · X_Ci / Z_Ci + Cx + δ_xi
yi = Fy · Y_Ci / Z_Ci + Cy + δ_yi,   i = 1, 2

where (xi, yi), i = 1, 2, are the actual image-point coordinates of P; (Fx, Fy) is the equivalent focal length; (Cx, Cy) are the principal-point coordinates; and (X_Ci, Y_Ci, Z_Ci) are the coordinates of P in the camera coordinate system of station Ci.
From the relative pose of the camera coordinate system and the world coordinate system:

Xc = r0·X + r1·Y + r2·Z + TX
Yc = r3·X + r4·Y + r5·Z + TY
Zc = r6·X + r7·Y + r8·Z + TZ

where (X, Y, Z) are the coordinates of the target point in the world coordinate system; r0–r8 are the components of the rotation matrix required to bring the world coordinate system into the attitude of the camera coordinate system; and TX, TY, TZ represent the translation that moves the world-coordinate origin to the camera-coordinate origin.
Take the geographic coordinate system of the first measurement point as the world coordinate system; then TX = TY = TZ = 0 for the first measurement point, and TX, TY, TZ of the second measurement point are computed from the difference of the two satellite-positioning fixes.
From the inertial measurement unit and the camera gimbal, obtain the aircraft yaw angle φ, pitch angle γ, and roll angle θ, and the camera azimuth α and elevation β; then (rows separated by semicolons)

R = [r0 r1 r2; r3 r4 r5; r6 r7 r8]
  = [cos φ, sin φ, 0; −sin φ, cos φ, 0; 0, 0, 1]
  · [cos γ, 0, −sin γ; 0, 1, 0; sin γ, 0, cos γ]
  · [1, 0, 0; 0, cos θ, sin θ; 0, −sin θ, cos θ]
  · [cos α, sin α, 0; −sin α, cos α, 0; 0, 0, 1]
  · [cos β, 0, −sin β; 0, 1, 0; sin β, 0, cos β]

Solving the resulting simultaneous equations yields the coordinates (X, Y, Z) of point P.
Further, step 5 is as follows.
Along the preset flight path, the UAV photographs the target point n (n > 2) times, obtaining n images.
Then, from the collinearity equations,

Z = H(S)

where Z = [x1 y1 … xn yn]^T and S = [X, Y, Z]^T.
A first-order Taylor expansion of the above around the initial value S0 gives

Z = H(S0) + B·(S − S0) + Δn

where B = ∂H/∂S evaluated at S0 and Δn is the measurement noise.
Let

U = Z − H(S0)
V = S − S0

Therefore

U = B·V + Δn

and least-squares estimation gives

V̂ = (B^T B)^{−1} B^T U

At each measurement point the attitude of the aircraft is different; even with the same camera, the differing exterior camera parameters give each measurement point a different positioning accuracy and a different contribution to the error. Weighted least squares is therefore introduced. Let R^{−1} be the weighting matrix, with R = diag(σ1², σ2², …, σ2n²); then

V̂ = (B^T R^{−1} B)^{−1} B^T R^{−1} U
S0 can be obtained from the cross-positioning principle. Because the position error of the initial value is large and linearization adds further error, the first estimate deviates considerably from the true value; an iterative method is used, and iteration terminates when the positioning result approaches a stationary value.
The weighting matrix is difficult to estimate; a diagonal matrix, or an even simpler identity matrix, is usually chosen. Although the chosen weight matrix has error, the weighted least-squares estimate of the unknown parameters remains unbiased. The invention adopts a convenient, well-founded method of obtaining the weight matrix, which performs well in practice. Its core idea: give measurement points that cause larger error smaller weights, and give measurement points with smaller error larger weights, so as to increase the "contribution" of the better measurement points and improve the precision of the least-squares estimate. In positioning, the farther a measurement point is from the target point, the worse the positioning accuracy; the distance between measurement point and target point is determined jointly by the elevation of the measurement point and the pointing angle of the measurement point's camera optical axis, following the basic triangle relation. It follows that

σ = H / cos ε

where σ is the element of the weight matrix, ε is the pointing angle of the camera optical axis, and H is the height of the measurement point.
Beneficial effects relative to the prior art: the invention can obtain the depth between the UAV and the crops accurately and in real time. Compared with existing plant protection UAV schemes, which are strongly affected by temperature and pressure, the technique of the invention has good anti-interference capability, and the equipment is simple and inexpensive; mounting a single imaging device on an ordinary plant protection UAV is enough to hold the UAV at a fixed height, so the method is easy to deploy and has broad application prospects.
Brief description of the drawings
Fig. 1 is a block diagram of the monocular-sequence-image altitude-determination pipeline of the plant protection UAV altitude-determination method of the invention;
Fig. 2 is a schematic diagram of the SAD search strategy of the method;
Fig. 3 is a schematic diagram of cross positioning in the method.
Embodiment
The invention provides an altitude-determination method for plant protection UAVs. To make the purpose, technical scheme, and effect of the invention clearer, the invention is described in more detail below with reference to the drawings and examples. It should be understood that the specific embodiments described here only explain the invention and are not intended to limit it.
Step 1: capture a forward-looking picture sequence with the camera mounted on the UAV.
First calibrate the camera's intrinsic parameters. Camera calibration is essentially the process of determining the camera's interior and exterior parameters; calibration of the interior parameters means determining the camera's intrinsic geometric and optical parameters, independent of pose, including the image-center coordinates, focal length, scale factor, and lens distortion. In the invention, the plant protection UAV is equipped with a satellite positioning receiver, an inertial measurement unit (IMU), a camera, and an image transmission device. When the UAV performs plant protection work, the mounted camera shoots a forward-looking picture sequence. This arrangement is chosen because acquiring and transmitting images and computing on them all take time; the forward-looking camera strategy ensures that the altitude data are updated in time.
The camera calibration method proposed by Zhang Zhengyou is convenient, easy to operate, and of moderate accuracy, and is widely used for intrinsic calibration (Zhang Z. A Flexible New Technique for Camera Calibration [J]. IEEE Transactions on Pattern Analysis & Machine Intelligence, 2000, 22(11): 1330-1334). In this method, the camera photographs a planar target from two or more different orientations; the camera and the 2D target can move freely, and the motion parameters need not be known. The calibration assumes that the intrinsic parameters remain constant throughout: no matter from which angle the camera shoots the target, the intrinsic parameters do not change, and only the exterior parameters vary. The embodiment of the invention calibrates the intrinsic parameters with Zhang Zhengyou's method, shooting 15 images of the target from different orientations. To improve calibration accuracy and reduce random error, the acquired images are distributed over the whole field of view, span a range of shooting distances (depths), and the placement angle of the target is varied substantially.
Step 2: extract feature points from the current captured frame.
2.1: as shown in Fig. 1, apply the SLIC image segmentation algorithm to perform superpixel segmentation of the k-th frame, and compute the centroid coordinates of each region block, obtaining point set I1;
2.2: extract the corner coordinates of the k-th frame with the Harris corner detector, obtaining point set I2; merge I1 and I2 into the feature point set Ip to be matched.
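A minimal sketch of the corner-detection half of this step (the point set I2), written in plain NumPy rather than with any particular library; the function name, smoothing window, and threshold are illustrative assumptions, and the SLIC superpixel half (I1) is omitted for brevity:

```python
import numpy as np

def harris_corners(img, k=0.04, rel_thresh=0.1):
    """Return (row, col) coordinates whose Harris response exceeds
    rel_thresh * (max response). img: 2-D float array."""
    Iy, Ix = np.gradient(img.astype(float))        # image gradients
    # 3x3 box smoothing of the structure-tensor entries
    def box3(a):
        p = np.pad(a, 1, mode="edge")
        return sum(p[i:i + a.shape[0], j:j + a.shape[1]]
                   for i in range(3) for j in range(3)) / 9.0
    Sxx, Syy, Sxy = box3(Ix * Ix), box3(Iy * Iy), box3(Ix * Iy)
    # Harris response: det(M) - k * trace(M)^2
    R = Sxx * Syy - Sxy ** 2 - k * (Sxx + Syy) ** 2
    return np.argwhere(R > rel_thresh * R.max())
```

In the full pipeline these corners (I2) would be merged with the SLIC superpixel centroids (I1) to form the candidate set Ip.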
Step 3: feature point matching. Using the SAD algorithm, detect in the (k−1)-th frame the match of each point in Ip, obtaining point set Ic, as follows.
As shown in Fig. 2, select an N × N region around each feature point as the template, then search for the match of this template in the (k−1)-th frame. To reduce the search space, the search region is restricted to an (N_search × N_search) neighborhood of the feature point; the matching similarity measure is

SAD(dx, dy) = Σ | template(x, y) − image_{k−1}(x + dx, y + dy) |

where (x, y) are the coordinates of points in the template and (dx, dy) is the relative coordinate offset. Shift the position of the search-window center to minimize the similarity measure; the point at which the similarity measure reaches its minimum is the match point.
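The SAD template search described above can be sketched as a brute-force scan in plain NumPy; the function name and default search radius are illustrative assumptions:

```python
import numpy as np

def sad_match(template, image, center, n_search=8):
    """Find the offset (dx, dy), inside a (2*n_search+1)^2 search window
    around `center` in `image` (frame k-1), that minimizes the SAD against
    `template` (an N x N patch cut from frame k around the feature point)."""
    n = template.shape[0]
    h = n // 2
    cy, cx = center
    best_sad, best_off = None, None
    for dy in range(-n_search, n_search + 1):
        for dx in range(-n_search, n_search + 1):
            y, x = cy + dy, cx + dx
            patch = image[y - h:y + h + 1, x - h:x + h + 1]
            if patch.shape != template.shape:   # window fell off the image
                continue
            sad = np.abs(template.astype(float) - patch.astype(float)).sum()
            if best_sad is None or sad < best_sad:
                best_sad, best_off = sad, (dx, dy)
    return best_off, best_sad
```

For each feature point of frame k, the returned offset locates its match in frame k−1; collecting these matches yields the point set Ic.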
Step 4: if n = 2, use cross positioning. From the set of matched corresponding points, the position and attitude of the aircraft, and the camera angles, compute the three-dimensional coordinates of each corresponding point.
As shown in Fig. 3, suppose the ground target is photographed from two aerial points C1 and C2, yielding a stereo pair; the image points of ground target point P in the left and right photographs are p1 and p2. The corresponding rays C1p1 and C2p2 clearly intersect at the ground target point P.
From the perspective-projection imaging relation, the collinearity equations of the C1 and C2 images are derived as:

xi = Fx · X_Ci / Z_Ci + Cx + δ_xi
yi = Fy · Y_Ci / Z_Ci + Cy + δ_yi,   i = 1, 2

where (xi, yi), i = 1, 2, are the actual image-point coordinates of P; (Fx, Fy) is the equivalent focal length; (Cx, Cy) are the principal-point coordinates; and (X_Ci, Y_Ci, Z_Ci) are the coordinates of P in the camera coordinate system of station Ci.
From the relative pose of the camera coordinate system and the world coordinate system:

Xc = r0·X + r1·Y + r2·Z + TX
Yc = r3·X + r4·Y + r5·Z + TY
Zc = r6·X + r7·Y + r8·Z + TZ

where (X, Y, Z) are the coordinates of the target point in the world coordinate system; r0–r8 are the components of the rotation matrix required to bring the world coordinate system into the attitude of the camera coordinate system; and TX, TY, TZ represent the translation that moves the world-coordinate origin to the camera-coordinate origin.
Take the geographic coordinate system of the first measurement point as the world coordinate system; then TX = TY = TZ = 0 for the first measurement point, and TX, TY, TZ of the second measurement point are computed from the difference of the two satellite-positioning fixes. From the inertial measurement unit and the camera gimbal, obtain the aircraft yaw angle φ, pitch angle γ, and roll angle θ, and the camera azimuth α and elevation β; then (rows separated by semicolons)

R = [r0 r1 r2; r3 r4 r5; r6 r7 r8]
  = [cos φ, sin φ, 0; −sin φ, cos φ, 0; 0, 0, 1]
  · [cos γ, 0, −sin γ; 0, 1, 0; sin γ, 0, cos γ]
  · [1, 0, 0; 0, cos θ, sin θ; 0, −sin θ, cos θ]
  · [cos α, sin α, 0; −sin α, cos α, 0; 0, 0, 1]
  · [cos β, 0, −sin β; 0, 1, 0; sin β, 0, cos β]

Solving the resulting simultaneous equations yields the coordinates (X, Y, Z) of point P.
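Once the collinearity and pose relations are combined, each view contributes two equations that are linear in the unknown (X, Y, Z), so the intersection can be computed as a small linear least-squares problem. The sketch below assumes undistorted pixel coordinates (the δ terms already removed); all names are illustrative:

```python
import numpy as np

def cross_position(views, Fx, Fy, Cx, Cy):
    """views: list of ((x, y) pixel, R 3x3 rotation, T length-3 translation)
    with pc = R @ P + T,  x = Fx*pc[0]/pc[2] + Cx,  y = Fy*pc[1]/pc[2] + Cy.
    Returns the world point P = (X, Y, Z) by linear least squares."""
    A, b = [], []
    for (x, y), R, T in views:
        R = np.asarray(R, float); T = np.asarray(T, float)
        # (x-Cx)*(r6..r8 . P + TZ) = Fx*(r0..r2 . P + TX), likewise for y
        A.append((x - Cx) * R[2] - Fx * R[0])
        b.append(Fx * T[0] - (x - Cx) * T[2])
        A.append((y - Cy) * R[2] - Fy * R[1])
        b.append(Fy * T[1] - (y - Cy) * T[2])
    P, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return P
```

With two views this is exactly the cross positioning of step 4; with more views the same solve also yields the initial value S0 for the weighted refinement of step 5.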
Step 5: if n > 2, use multi-station positioning based on weighted least squares to compute the three-dimensional coordinates of each corresponding point.
The result of the positioning method above is very sensitive to noise of all kinds. By measuring each corresponding target point multiple times and using a multi-station positioning model based on weighted least squares with an optimal solution algorithm, the precision and robustness of the positioning algorithm are improved.
Along the preset flight path, the UAV photographs the target point n (n > 2) times, obtaining n images.
Then, from the collinearity equations,

Z = H(S)

where Z = [x1 y1 … xn yn]^T and S = [X, Y, Z]^T.
A first-order Taylor expansion of the above around the initial value S0 gives

Z = H(S0) + B·(S − S0) + Δn

where B = ∂H/∂S evaluated at S0 and Δn is the measurement noise.
Let

U = Z − H(S0)
V = S − S0

Therefore

U = B·V + Δn

and least-squares estimation gives

V̂ = (B^T B)^{−1} B^T U

At each measurement point the attitude of the aircraft is different; even with the same camera, the differing exterior camera parameters give each measurement point a different positioning accuracy and a different contribution to the error. Weighted least squares is therefore introduced. Let R^{−1} be the weighting matrix, with R = diag(σ1², σ2², …, σ2n²); then

V̂ = (B^T R^{−1} B)^{−1} B^T R^{−1} U

S0 can be obtained from the cross-positioning principle. Because the position error of the initial value is large and linearization adds further error, the first estimate deviates considerably from the true value; an iterative method is used, and iteration terminates when the positioning result approaches a stationary value.
The weighting matrix is difficult to estimate; a diagonal matrix, or an even simpler identity matrix, is usually chosen. Although the chosen weight matrix has error, the weighted least-squares estimate of the unknown parameters remains unbiased. The invention adopts a convenient, well-founded method of obtaining the weight matrix, which performs well in practice. Its core idea: give measurement points that cause larger error smaller weights, and give measurement points with smaller error larger weights, so as to increase the "contribution" of the better measurement points and improve the precision of the least-squares estimate. In positioning, the farther a measurement point is from the target point, the worse the positioning accuracy; the distance between measurement point and target point is determined jointly by the elevation of the measurement point and the pointing angle of the measurement point's camera optical axis, following the basic triangle relation. It follows that

σ = H / cos ε

where σ is the element of the weight matrix, ε is the pointing angle of the camera optical axis, and H is the height of the measurement point.
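The iterative weighted least-squares refinement of step 5 can be sketched as a Gauss–Newton loop with a numerically estimated Jacobian B and weighting matrix R^{-1} = diag(1/σ_i²); the function name, the finite-difference step, and the way the per-measurement σ_i are supplied are illustrative assumptions:

```python
import numpy as np

def wls_multistation(S0, h_funcs, z_obs, sigma, n_iter=20, eps=1e-6):
    """Refine target position S = (X, Y, Z).
    h_funcs: one measurement function h_i(S) -> scalar per observation;
    z_obs:   observed values Z; sigma: per-observation error scale
    (larger sigma => smaller weight in R^{-1} = diag(1/sigma^2))."""
    S = np.asarray(S0, float)
    Rinv = np.diag(1.0 / np.asarray(sigma, float) ** 2)
    for _ in range(n_iter):
        HS = np.array([h(S) for h in h_funcs])
        U = z_obs - HS                     # U = Z - H(S0)
        B = np.zeros((len(h_funcs), 3))    # Jacobian dH/dS, finite differences
        for j in range(3):
            dS = np.zeros(3); dS[j] = eps
            B[:, j] = (np.array([h(S + dS) for h in h_funcs]) - HS) / eps
        # V = (B^T R^-1 B)^-1 B^T R^-1 U, then S <- S + V
        V = np.linalg.solve(B.T @ Rinv @ B, B.T @ Rinv @ U)
        S = S + V
        if np.linalg.norm(V) < 1e-10:      # iterate until stationary
            break
    return S
```

In the scheme of the invention, each σ_i would be set from the height H and optical-axis pointing angle ε of the corresponding measurement station.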
Claims (5)
1. An altitude-determination method for a plant protection UAV, characterized by the following steps:
Step 1: shoot the ground scene ahead with the camera mounted on the UAV, and save the picture sequence at constant time intervals;
Step 2: extract feature points from the current captured frame;
Step 3: extract n frames backwards from the saved picture sequence, and match the extracted feature points using the SAD algorithm;
Step 4: if n = 2, use cross positioning: from the set of matched corresponding points, the position and attitude of the aircraft, and the camera angles, compute the three-dimensional coordinates of each corresponding point;
Step 5: if n > 2, use multi-station positioning based on weighted least squares to compute the three-dimensional coordinates of each corresponding point.
2. The altitude-determination method for a plant protection UAV according to claim 1, characterized in that step 2 proceeds as follows:
2.1: apply the SLIC image segmentation algorithm to perform superpixel segmentation of the k-th frame, and compute the centroid coordinates of each region block, obtaining point set I1;
2.2: extract the corner coordinates of the k-th frame with the Harris corner detector, obtaining point set I2; merge I1 and I2 into the feature point set Ip to be matched.
3. The altitude-determination method for a plant protection UAV according to claim 2, characterized in that step 3 proceeds as follows:
3.1: around each feature point, select an N × N region as the template;
3.2: search for the match of this template in the (k−1)-th frame, with the search region restricted to an (N_search × N_search) neighborhood of the feature point; the matching similarity measure is

SAD(dx, dy) = Σ | template(x, y) − image_{k−1}(x + dx, y + dy) |

where (x, y) are the coordinates of points in the template and (dx, dy) is the relative coordinate offset;
3.3: shift the position of the search-window center to minimize the similarity measure; when the similarity measure is minimal, that point is the match point:

(dx, dy)* = arg min over (dx, dy) of SAD(dx, dy)
4. The altitude-determination method for a plant protection UAV according to claim 3, characterized in that step 4 is cross positioning, specifically as follows:
Suppose the ground target is photographed from two aerial points C1 and C2, and the image points of ground target point P in the left and right photographs are p1 and p2;
from the perspective-projection imaging relation, the collinearity equations of the C1 and C2 images are derived as (shown for i = 2; the i = 1 pair is analogous):

x2 = Fx · X_C2 / Z_C2 + Cx + δ_x2
y2 = Fy · Y_C2 / Z_C2 + Cy + δ_y2

where (xi, yi), i = 1, 2, are the actual image-point coordinates of P; (Fx, Fy) is the equivalent focal length; (Cx, Cy) are the principal-point coordinates; and (X_Ci, Y_Ci, Z_Ci) are the coordinates of P in the camera coordinate system of station Ci;
from the relative pose of the camera coordinate system and the world coordinate system:

Xc = r0·X + r1·Y + r2·Z + TX
Yc = r3·X + r4·Y + r5·Z + TY
Zc = r6·X + r7·Y + r8·Z + TZ

where (X, Y, Z) are the coordinates of the target point in the world coordinate system; r0–r8 are the components of the rotation matrix required to bring the world coordinate system into the attitude of the camera coordinate system; and TX, TY, TZ represent the translation that moves the world-coordinate origin to the camera-coordinate origin;
take the geographic coordinate system of the first measurement point as the world coordinate system; then TX = TY = TZ = 0 for the first measurement point, and TX, TY, TZ of the second measurement point are computed from the difference of the two satellite-positioning fixes;
from the inertial measurement unit and the camera gimbal, obtain the aircraft yaw angle φ, pitch angle γ, and roll angle θ, and the camera azimuth α and elevation β; then (rows separated by semicolons)

R = [r0 r1 r2; r3 r4 r5; r6 r7 r8]
  = [cos φ, sin φ, 0; −sin φ, cos φ, 0; 0, 0, 1]
  · [cos γ, 0, −sin γ; 0, 1, 0; sin γ, 0, cos γ]
  · [1, 0, 0; 0, cos θ, sin θ; 0, −sin θ, cos θ]
  · [cos α, sin α, 0; −sin α, cos α, 0; 0, 0, 1]
  · [cos β, 0, −sin β; 0, 1, 0; sin β, 0, cos β]

Solving the resulting simultaneous equations yields the coordinates (X, Y, Z) of point P.
5. The plant protection UAV altitude-determination method according to claim 3, wherein said step 5 is specifically as follows:
the UAV photographs the target point n times (n > 2) while flying along the preset path, obtaining n images;
then, from the collinearity equations, it follows that:
Z = H(S)
where Z = [x_1\ y_1\ \cdots\ x_n\ y_n]^T and S = [X, Y, Z]^T;
H(S) =
\begin{bmatrix}
h_0(X, Y, Z) \\
h_1(X, Y, Z) \\
\vdots \\
h_{2n-2}(X, Y, Z) \\
h_{2n-1}(X, Y, Z)
\end{bmatrix}
=
\begin{bmatrix}
F_x \dfrac{X_{C_1}}{Z_{C_1}} + C_x + \delta_{x_1} \\
F_y \dfrac{Y_{C_1}}{Z_{C_1}} + C_y + \delta_{y_1} \\
\vdots \\
F_x \dfrac{X_{C_n}}{Z_{C_n}} + C_x + \delta_{x_n} \\
F_y \dfrac{Y_{C_n}}{Z_{C_n}} + C_y + \delta_{y_n}
\end{bmatrix}
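A sketch of how the stacked measurement function H(S) could be evaluated numerically. The pose representation (R, t), the intrinsics F_x, F_y, C_x, C_y, and the omission of the distortion terms δ are illustrative assumptions for this sketch, not details fixed by the patent:

```python
import numpy as np

def h_stack(S, poses, Fx, Fy, Cx, Cy):
    """Evaluate H(S): for each camera pose (R, t), project the ground
    point S = (X, Y, Z) into the image and stack the 2n coordinates."""
    rows = []
    for R, t in poses:
        Xc, Yc, Zc = R @ (np.asarray(S, dtype=float) - t)  # point in camera frame
        rows.append(Fx * Xc / Zc + Cx)  # x_i = F_x * X_C/Z_C + C_x (+ delta_x, omitted here)
        rows.append(Fy * Yc / Zc + Cy)  # y_i = F_y * Y_C/Z_C + C_y (+ delta_y, omitted here)
    return np.array(rows)
```

Each of the n cameras contributes two rows (x_i, y_i), so the result is the 2n-vector Z = H(S) of the claim.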
Performing a first-order Taylor expansion of H(S) about the initial value S^0 gives:
Z = H(S^0) + B\cdot(S - S^0) + \Delta n
where:
B =
\begin{bmatrix}
\left.\dfrac{\partial h_0}{\partial X}\right|_0 & \left.\dfrac{\partial h_0}{\partial Y}\right|_0 & \left.\dfrac{\partial h_0}{\partial Z}\right|_0 \\
\left.\dfrac{\partial h_1}{\partial X}\right|_0 & \left.\dfrac{\partial h_1}{\partial Y}\right|_0 & \left.\dfrac{\partial h_1}{\partial Z}\right|_0 \\
\vdots & \vdots & \vdots \\
\left.\dfrac{\partial h_{2n-1}}{\partial X}\right|_0 & \left.\dfrac{\partial h_{2n-1}}{\partial Y}\right|_0 & \left.\dfrac{\partial h_{2n-1}}{\partial Z}\right|_0
\end{bmatrix}_{2n \times 3}
Let
U = Z - H(S^0)
V = S - S^0
Therefore,
U = BV + \Delta n
By least-squares estimation, we obtain
\hat{V} = (B^T B)^{-1} B^T U
Weighted least squares is then introduced: let R^{-1} be the weighting matrix, with
R =
\begin{bmatrix}
\sigma_{x_1} & 0 & 0 & \cdots & 0 \\
0 & \sigma_{y_1} & 0 & \cdots & 0 \\
0 & 0 & \sigma_{x_2} & \cdots & 0 \\
\vdots & \vdots & \vdots & \ddots & \vdots \\
0 & 0 & 0 & \cdots & \sigma_{y_n}
\end{bmatrix}_{2n \times 2n}
Then
\hat{V} = (B^T R^{-1} B)^{-1} B^T R^{-1} U
Therefore,
\hat{S} = (B^T R^{-1} B)^{-1} B^T R^{-1} (Z - H(S^0)) + S^0
S^0 can be obtained from the cross-bearing (intersection) principle. The estimate is computed iteratively, and the iteration terminates when the positioning result converges to a stationary value.
Measurement points that introduce larger errors are given smaller weights, and those with smaller errors are given larger weights.
It follows that
\sigma \propto \frac{\cos \varepsilon}{H}
where σ is an element of the weight matrix, ε is the pointing angle of the camera optical axis, and H is the measurement-point height.
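One weighted update step \hat{S} = (B^T R^{-1} B)^{-1} B^T R^{-1}(Z - H(S^0)) + S^0 can be sketched as below. The central-difference numerical Jacobian and the generic measurement function `h` are assumptions made for illustration; the patent would derive B analytically from the collinearity equations:

```python
import numpy as np

def weighted_update(h, S0, Z, sigmas):
    """One weighted Gauss-Newton iteration:
    S_hat = (B^T R^-1 B)^-1 B^T R^-1 (Z - H(S0)) + S0,
    with R = diag(sigmas) and B the Jacobian of h at S0 (numerical)."""
    S0 = np.asarray(S0, dtype=float)
    eps = 1e-6
    B = np.empty((len(Z), len(S0)))
    for j in range(len(S0)):
        d = np.zeros_like(S0)
        d[j] = eps
        B[:, j] = (h(S0 + d) - h(S0 - d)) / (2 * eps)  # central difference
    Rinv = np.diag(1.0 / np.asarray(sigmas, dtype=float))
    A = B.T @ Rinv @ B
    return np.linalg.solve(A, B.T @ Rinv @ (np.asarray(Z) - h(S0))) + S0
```

Repeating this update until \hat{S} stops changing implements the stated convergence criterion; the entries of `sigmas` would be set following σ ∝ cos ε / H.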
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710006131.4A CN107065895A (en) | 2017-01-05 | 2017-01-05 | A kind of plant protection unmanned plane determines high-tech |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107065895A true CN107065895A (en) | 2017-08-18 |
Family
ID=59623648
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710006131.4A Pending CN107065895A (en) | 2017-01-05 | 2017-01-05 | A kind of plant protection unmanned plane determines high-tech |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107065895A (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104182974A (en) * | 2014-08-12 | 2014-12-03 | 大连理工大学 | A speeded up method of executing image matching based on feature points |
CN104501779A (en) * | 2015-01-09 | 2015-04-08 | 中国人民解放军63961部队 | High-accuracy target positioning method of unmanned plane on basis of multi-station measurement |
Non-Patent Citations (5)
Title |
---|
Ji Shujiao et al.: "Electronic Image Stabilization Method Based on Feature Matching in Regions of Interest", Journal of Jilin University (Information Science Edition) * |
Xu Cheng et al.: "Multi-Target Localization Algorithm Based on an Electro-Optical Measurement Platform", Journal of Central South University (Natural Science Edition) * |
Xu Cheng: "Research on Several Key Technologies of High-Precision UAV Target Localization", Wanfang Data * |
Shen Nan et al.: "An Automatic Registration Method for Remote Sensing Images", Computer Engineering and Applications * |
Xiao Jing et al.: "Feature-Point-Based Local Template Matching for Aircraft", Information Technology * |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108140245A (en) * | 2017-12-25 | 2018-06-08 | 深圳市道通智能航空技术有限公司 | Distance measuring method, device and unmanned plane |
CN108140245B (en) * | 2017-12-25 | 2022-08-23 | 深圳市道通智能航空技术股份有限公司 | Distance measurement method and device and unmanned aerial vehicle |
CN109141396B (en) * | 2018-07-16 | 2022-04-26 | 南京航空航天大学 | Unmanned aerial vehicle pose estimation method with fusion of auxiliary information and random sampling consistency algorithm |
CN109141396A (en) * | 2018-07-16 | 2019-01-04 | 南京航空航天大学 | The UAV position and orientation estimation method that auxiliary information is merged with random sampling unification algorism |
CN109410281A (en) * | 2018-11-05 | 2019-03-01 | 珠海格力电器股份有限公司 | A kind of position control method, device, storage medium and logistics system |
CN109916406A (en) * | 2019-01-10 | 2019-06-21 | 浙江大学 | A kind of circular object localization method based on unmanned aerial vehicle group |
CN109916406B (en) * | 2019-01-10 | 2020-10-13 | 浙江大学 | Surrounding target positioning method based on unmanned aerial vehicle cluster |
CN110658540A (en) * | 2019-09-18 | 2020-01-07 | 华南农业大学 | Method for testing satellite navigation automatic operation accuracy of transplanter by using unmanned aerial vehicle low-altitude flight target positioning technology |
CN111367302A (en) * | 2020-03-03 | 2020-07-03 | 大连海洋大学 | Unmanned aerial vehicle self-adaptive height-fixing method for offshore cage culture inspection |
CN111367302B (en) * | 2020-03-03 | 2023-03-21 | 大连海洋大学 | Unmanned aerial vehicle self-adaptive height-fixing method for offshore cage culture inspection |
CN111627043A (en) * | 2020-04-13 | 2020-09-04 | 浙江工业大学 | Simple human body curve acquisition method based on marker and feature filter |
CN111627043B (en) * | 2020-04-13 | 2023-09-19 | 浙江工业大学 | Simple human body curve acquisition method based on markers and feature screeners |
CN116580099A (en) * | 2023-07-14 | 2023-08-11 | 山东艺术学院 | Forest land target positioning method based on fusion of video and three-dimensional model |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107065895A (en) | A kind of plant protection unmanned plane determines high-tech | |
CN112525162B (en) | System and method for measuring image distance of power transmission line by unmanned aerial vehicle | |
CN106408601B (en) | A kind of binocular fusion localization method and device based on GPS | |
Xiang et al. | Method for automatic georeferencing aerial remote sensing (RS) images from an unmanned aerial vehicle (UAV) platform | |
CN106767714B (en) | Improve the equivalent mismatch model multistage Calibration Method of satellite image positioning accuracy | |
US7751651B2 (en) | Processing architecture for automatic image registration | |
CN103047985B (en) | A kind of method for rapidly positioning of extraterrestrial target | |
CN106155081B (en) | A kind of a wide range of target monitoring of rotor wing unmanned aerial vehicle and accurate positioning method | |
CN104501779A (en) | High-accuracy target positioning method of unmanned plane on basis of multi-station measurement | |
CN105424006A (en) | Unmanned aerial vehicle hovering precision measurement method based on binocular vision | |
CN104200086A (en) | Wide-baseline visible light camera pose estimation method | |
CN108665499B (en) | Near distance airplane pose measuring method based on parallax method | |
CN106971408A (en) | A kind of camera marking method based on space-time conversion thought | |
CN109269525B (en) | Optical measurement system and method for take-off or landing process of space probe | |
CN107390704A (en) | A kind of multi-rotor unmanned aerial vehicle light stream hovering method based on IMU pose compensations | |
CN108955685A (en) | A kind of tanker aircraft tapered sleeve pose measuring method based on stereoscopic vision | |
CN106403900A (en) | Flyer tracking and locating system and method | |
CN110400330A (en) | Photoelectric nacelle image tracking method and tracking system based on fusion IMU | |
CN108007437B (en) | Method for measuring farmland boundary and internal obstacles based on multi-rotor aircraft | |
CN112184786A (en) | Target positioning method based on synthetic vision | |
CN108594223A (en) | On-board SAR image object localization method | |
Hill et al. | Ground-to-air flow visualization using Solar Calcium-K line Background-Oriented Schlieren | |
CN105403886B (en) | A kind of carried SAR scaler picture position extraction method | |
CN113340272B (en) | Ground target real-time positioning method based on micro-group of unmanned aerial vehicle | |
CN107784633A (en) | Suitable for the unmanned plane image calibrating method of plane survey |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20170818 |