CN103925920B - Indoor autonomous navigation method for a micro air vehicle based on perspective images - Google Patents

Indoor autonomous navigation method for a micro air vehicle based on perspective images

Info

Publication number
CN103925920B
CN103925920B (application CN201410143275.0A)
Authority
CN
China
Prior art keywords
image
mav
indoor
vanishing point
flight
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201410143275.0A
Other languages
Chinese (zh)
Other versions
CN103925920A (en)
Inventor
赵春晖
王荣志
张天武
潘泉
马鑫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xi'an Chenxiang Zhuoyue Technology Co., Ltd.
Original Assignee
Northwestern Polytechnical University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northwestern Polytechnical University filed Critical Northwestern Polytechnical University
Priority to CN201410143275.0A priority Critical patent/CN103925920B/en
Publication of CN103925920A publication Critical patent/CN103925920A/en
Application granted granted Critical
Publication of CN103925920B publication Critical patent/CN103925920B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20: Instruments for performing navigational calculations
    • G01C21/206: Instruments for performing navigational calculations specially adapted for indoor navigation
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10: Simultaneous control of position or course in three dimensions
    • G05D1/101: Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D1/102: Simultaneous control of position or course in three dimensions specially adapted for aircraft, specially adapted for vertical take-off of aircraft

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The present invention proposes an indoor autonomous navigation method for a micro air vehicle (MAV) based on perspective images. The indoor environment is divided into three types: corridor, stairs, and room. The environment type is determined by analyzing the perspective features of visual images, and obstacle detection is then performed using methods such as vanishing points, center lines, or optical flow, from which the corresponding navigation and control information is derived, achieving autonomous indoor obstacle avoidance and flight for the MAV. The invention requires no three-dimensional model of the environment, which substantially reduces algorithm running time and improves the timeliness of control commands, while offering high autonomy and high reliability. The method has a small computational load, good real-time performance, low hardware requirements, and relatively high positioning accuracy. It provides a feasible technical solution for the engineering application of indoor MAV navigation.

Description

Indoor autonomous navigation method for a micro air vehicle based on perspective images
Technical field
The invention belongs to the technical field of unmanned aerial vehicle navigation, and specifically relates to an indoor autonomous navigation method for a micro air vehicle based on perspective images.
Background technology
A micro air vehicle (MAV) is an aircraft roughly the size of a human hand. As battlefield reconnaissance equipment that an individual soldier can carry, its potential roles include aerial surveillance, biological-agent detection, target recognition, and communication relay, and it has a unique advantage for inspecting the interiors of large buildings.
High-precision, highly reliable autonomous navigation is one of the key technologies for ensuring that an MAV completes its various tasks, and it is of great significance for strengthening the MAV's capacity for autonomous behavior and improving its operational effectiveness. At present, the basic idea for realizing UAV autonomous navigation is to perceive the UAV's own state and the flight environment in real time through onboard sensors, determine the UAV's motion state and the relevant navigation parameters through multi-source information fusion, and thereby realize functions such as environment perception, obstacle avoidance, and path planning.
Visual navigation is a new navigation technology that has risen with the rapid development of computer hardware and image processing, and it draws on several disciplines, including optics, pattern recognition, image processing, and navigation. In a visual navigation system, the vehicle perceives the environment through an imaging sensor, and a computer then analyzes the images to obtain navigation information such as the vehicle's position and attitude. The autonomy, flexibility, and adaptability of visual navigation have quickly made it a research hotspot in the navigation field.
Indoor MAV flight hinges on high-precision positioning and on perceiving and avoiding spatial obstacles. Compared with inertial navigation and GPS navigation, visual navigation has a unique advantage in obstacle perception, so studying vision-based indoor autonomous navigation is of crucial importance. The conventional approach is to photograph the surroundings with a high-resolution camera and perform three-dimensional reconstruction, then determine a reasonable flight path from the reconstruction result; however, this approach requires a very large amount of computation, has poor real-time performance, places high demands on hardware, and yields relatively low positioning accuracy.
Summary of the invention
Technical problem to be solved
To remedy the deficiencies of the prior art, the present invention proposes an indoor autonomous navigation method for a micro air vehicle based on perspective images, realizing autonomous navigation of an MAV in indoor environments.
Technical scheme
An indoor autonomous navigation method for a micro air vehicle based on perspective images, characterized by the following steps:
Step 1: during indoor flight of the micro air vehicle (MAV), the onboard forward-looking camera Camera1 acquires image a in real time, and the onboard downward-looking camera Camera2 acquires image b in real time;
Step 2: analyze the perspective features of image a and image b to judge the environment type in which the MAV is located: corridor, stairs, or room. The process is as follows:
A. Apply the Hough transform to image a for line detection, obtaining image c, and judge whether image c contains a vanishing point, a vanishing point being a point that exhibits perspective convergence;
B. If there is a vanishing point, the indoor environment type is determined to be a corridor; if there is none, remove the horizontal and vertical lines in image c to obtain image d, then judge whether image d contains a vanishing point;
C. If there is a vanishing point, the environment type is likewise judged to be a corridor; if there is none, apply the Hough transform to image b for line detection, obtaining image e, and judge whether image e contains a cluster of equidistant parallel lines;
D. If there is a cluster of equidistant parallel lines, the indoor environment type is determined to be stairs; otherwise it is determined to be a room;
Step 3: for each environment type, detect obstacles on the flight path with the corresponding obstacle detection algorithm. The process is as follows:
A. For the corridor environment, the mean brightness of the 50×50 image region centered at the vanishing point serves as the criterion: if the difference between the mean brightness of the current frame and that of the previous frame is less than the threshold Threshold2=30, the path is considered clear; otherwise an obstacle is considered present;
B. For the stair environment, the length of the longest line in the image serves as the criterion: if the difference between the longest-line lengths of the current frame and the previous frame is less than the threshold Threshold3=15, the path is considered clear; otherwise an obstacle is considered present;
Step 4: when the detection result is clear, autonomous indoor flight of the MAV is realized through the flight control method for the corresponding environment type;
Step 5: if the obstacle detection result indicates an obstacle, autonomous obstacle avoidance of the MAV is realized through the obstacle avoidance method (a sketch of this decision flow follows these steps).
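The decision flow of steps 2 and 3 can be summarized as a small pure function. The sketch below is illustrative only: the Observations container and all names are hypothetical, and it assumes the perspective features (vanishing-point presence, line cluster, patch brightness, longest-line length) have already been extracted from images a and b.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Observations:
    """Per-frame quantities assumed already extracted from images a and b."""
    has_vanishing_point: bool           # found in image a (or image d)
    has_equidistant_lines: bool         # cluster of equidistant parallels in image b
    patch_brightness: Optional[float]   # mean of the 50x50 patch at the vanishing point
    longest_line_len: Optional[float]   # length of the longest Hough line

def classify(obs: Observations) -> str:
    # Step 2: corridor if a vanishing point exists, stairs if the downward
    # image shows a cluster of equidistant parallel lines, room otherwise.
    if obs.has_vanishing_point:
        return "corridor"
    return "stairs" if obs.has_equidistant_lines else "room"

def obstacle_free(env: str, cur: Observations, prev: Observations) -> bool:
    # Step 3: frame-to-frame stability checks with the patent's thresholds.
    if env == "corridor":
        return abs(cur.patch_brightness - prev.patch_brightness) < 30   # Threshold2
    if env == "stairs":
        return abs(cur.longest_line_len - prev.longest_line_len) < 15   # Threshold3
    return False   # room: an obstacle is always assumed present (see embodiment)
```

When obstacle_free returns True, the environment-specific flight control of step 4 runs; otherwise the optical-flow avoidance of step 5 takes over.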
The flight control method for each environment type in step 4 is as follows (see the sketch after situation 2):
Situation 1: for the corridor environment, let the vanishing point's coordinates in the image be $(\bar{x}, \bar{y})$. If $\bar{y}$ is less than half the image height, the controller sends the MAV a climb command; if $\bar{y}$ is greater than half the image height, the controller sends a descend command; otherwise the altitude is held. If $\bar{x}$ is less than half the image width, the controller sends the MAV a fly-left command; if $\bar{x}$ is greater than half the image width, a fly-right command; otherwise the heading is held;
Situation 2: for the stair environment, let dis denote the distance between two adjacent parallel lines. If the dis of the current frame exceeds the dis of the previous frame by more than the threshold Threshold4 (this threshold is set according to the accuracy requirement; as a rule it is less than one tenth of the image height), the controller sends the MAV a climb command; if the dis of the previous frame exceeds the dis of the current frame by more than Threshold4, the controller sends a descend command; otherwise the altitude is held. Let $\bar{x}$ be the abscissa of the midpoint of the longest line: if $\bar{x}$ is less than half the image width, the controller sends the MAV a fly-left command; if $\bar{x}$ is greater than half the image width, a fly-right command; otherwise the heading is held.
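Both situations reduce to threshold comparisons against the image center. A minimal sketch, with the command strings and function names being illustrative placeholders:

```python
def corridor_commands(vp_x: float, vp_y: float, w: int, h: int):
    # Situation 1: steer so that the vanishing point moves toward the image center.
    vertical = "climb" if vp_y < h / 2 else ("descend" if vp_y > h / 2 else "hold")
    lateral = "left" if vp_x < w / 2 else ("right" if vp_x > w / 2 else "hold")
    return vertical, lateral

def stair_vertical_command(dis_cur: float, dis_prev: float, threshold4: float) -> str:
    # Situation 2: compare the adjacent-line spacing of consecutive frames;
    # threshold4 is set per the accuracy requirement (< image height / 10).
    if dis_cur - dis_prev > threshold4:
        return "climb"
    if dis_prev - dis_cur > threshold4:
        return "descend"
    return "hold"
```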
The obstacle avoidance method in step 5 is:
Step 1: compute the optical flow between two consecutive frames with the Lucas-Kanade algorithm;
Step 2: process the optical flow and the angular velocity provided by the inertial navigation system with an extended Kalman filter, eliminating the rotational component of the flow and accurately estimating the translational component;
Step 3: recover the MAV's translational motion information and the distance to the obstacle from the translational component of the optical flow;
Step 4: according to the obstacle distance information, the controller sends the MAV the corresponding avoidance command.
Beneficial effects
The indoor autonomous navigation method for a micro air vehicle based on perspective images proposed by the present invention draws on the human visual experience of perspective: objects of identical size look larger when near than when far, so that, for example, the two rails seen along a railway line appear to intersect at a point, called the vanishing point. Indoor environments such as corridors generally contain a vanishing point, and its properties can guide the MAV in perceiving its surroundings and thereby achieving autonomous flight.
Following the principles of visual navigation, the present invention divides the indoor environment into three types: corridor, stairs, and room. It determines the environment type by analyzing the perspective features of visual images, then performs obstacle detection using methods such as vanishing points, center lines, or optical flow, derives the corresponding navigation and control information, and thereby achieves autonomous indoor obstacle avoidance and flight for the MAV. The invention requires no three-dimensional model of the environment, which substantially reduces algorithm running time and improves the timeliness of control commands, and it offers high autonomy and high reliability. The method has a small computational load, good real-time performance, low hardware requirements, and relatively high positioning accuracy, providing a feasible technical solution for the engineering application of indoor MAV navigation.
Brief description of the drawings
Fig. 1 is the overall framework of the invention.
Fig. 2 is the workflow of the indoor environment classifier.
Fig. 3 illustrates heading-angle correction in the stair environment.
Detailed description of the invention
The invention is further described below with reference to the embodiments and the accompanying drawings:
In an embodiment of the present invention, the perspective-image-based indoor autonomous navigation method for a micro air vehicle comprises the following steps:
First step: during indoor flight of the MAV, the onboard forward-looking camera Camera1 acquires image a in real time, and the onboard downward-looking camera Camera2 acquires image b in real time.
Second step: design an indoor environment classifier and judge the environment type in which the MAV is located by analyzing the perspective features of image a and image b.
The specific flow is shown in Fig. 2. Image a is first processed with the Canny edge detector, lines are then extracted with the Hough transform, and the detected horizontal and vertical lines are removed. The region Z of maximum line-intersection density is determined by the sliding-window method, and this region is taken to be the vanishing point.
The point density within the sliding window is defined as:
$$\rho_{window} = \frac{N}{S_{window}}$$
The point density over the whole image is:
$$\rho_{image} = \frac{K}{S_{image}}$$
where N is the number of line intersections in region Z, K is the number of line intersections in the whole image, $S_{window}$ is the area of region Z, and $S_{image}$ is the area of the image.
The vanishing-point criterion is defined as follows: region Z is judged to be a vanishing point when its point density sufficiently exceeds that of the whole image,
$$\frac{\rho_{window}}{\rho_{image}} > Threshold1$$
with Threshold1 = 30.
If a vanishing point exists, the environment of the MAV is determined to be a corridor, and the coordinates of the vanishing point are defined as
$$(\bar{x}, \bar{y}) = \frac{1}{N}\sum_{i=1}^{N}(x_i, y_i)$$
where $(x_i, y_i)$ are the coordinates of the intersection points in region Z.
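A possible OpenCV rendering of this sliding-window vanishing-point search is sketched below. The Canny and Hough parameters, the angle band used to discard horizontal and vertical lines, and the 50-pixel window stride are illustrative choices, and the density-ratio test against Threshold1 follows the criterion as reconstructed above.

```python
import cv2
import numpy as np

def line_intersection(l1, l2):
    # Intersection of the infinite lines through two Hough segments, or None.
    x1, y1, x2, y2 = map(float, l1)
    x3, y3, x4, y4 = map(float, l2)
    d = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if abs(d) < 1e-9:
        return None
    a = x1 * y2 - y1 * x2
    b = x3 * y4 - y3 * x4
    return ((a * (x3 - x4) - (x1 - x2) * b) / d,
            (a * (y3 - y4) - (y1 - y2) * b) / d)

def find_vanishing_point(gray, win=50, threshold1=30.0):
    # Region Z = densest win x win window of line intersections; returns the
    # centroid (x_bar, y_bar) of the intersections in Z, or None.
    h, w = gray.shape
    edges = cv2.Canny(gray, 50, 150)
    segs = cv2.HoughLinesP(edges, 1, np.pi / 180, 80,
                           minLineLength=30, maxLineGap=10)
    if segs is None:
        return None
    kept = []   # drop near-horizontal / near-vertical segments (as in image d)
    for x1, y1, x2, y2 in segs[:, 0]:
        ang = abs(np.degrees(np.arctan2(y2 - y1, x2 - x1))) % 180
        if 10 < ang < 80 or 100 < ang < 170:
            kept.append((x1, y1, x2, y2))
    pts = []
    for i in range(len(kept)):
        for j in range(i + 1, len(kept)):
            p = line_intersection(kept[i], kept[j])
            if p is not None and 0 <= p[0] < w and 0 <= p[1] < h:
                pts.append(p)
    if not pts:
        return None
    pts = np.array(pts)
    best, best_n = None, 0
    for cy in range(win // 2, h - win // 2, win // 2):    # sliding window
        for cx in range(win // 2, w - win // 2, win // 2):
            m = ((np.abs(pts[:, 0] - cx) <= win / 2) &
                 (np.abs(pts[:, 1] - cy) <= win / 2))
            if m.sum() > best_n:
                best_n, best = int(m.sum()), pts[m]
    if best is None:
        return None
    rho_window = best_n / float(win * win)                # N / S_window
    rho_image = len(pts) / float(w * h)                   # K / S_image
    if rho_window / max(rho_image, 1e-12) <= threshold1:  # assumed criterion
        return None
    return tuple(best.mean(axis=0))                       # (x_bar, y_bar)
```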
If no vanishing point exists, apply the Hough transform to image b for line detection and judge whether there is a cluster of equidistant parallel lines: if there is, the indoor environment type is determined to be stairs; if not, it is determined to be a room. A sketch of this test follows.
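The equidistant-parallel-line test on the downward image b can likewise be sketched with the standard Hough transform, checking that similarly oriented lines have (approximately) evenly spaced rho values; the tolerances below are illustrative, not from the patent.

```python
import cv2
import numpy as np

def is_stairs(gray, angle_tol=np.pi / 36, spacing_tol=5.0, min_lines=4):
    # Stair test on the downward image b: a cluster of parallel Hough lines
    # (similar theta) whose rho values are approximately equally spaced.
    edges = cv2.Canny(gray, 50, 150)
    lines = cv2.HoughLines(edges, 1, np.pi / 180, 100)
    if lines is None or len(lines) < min_lines:
        return False
    rho_theta = lines[:, 0]                    # shape (N, 2): rho, theta
    dominant = np.median(rho_theta[:, 1])      # crude dominant orientation
    rhos = np.sort(rho_theta[np.abs(rho_theta[:, 1] - dominant) < angle_tol, 0])
    if len(rhos) < min_lines:
        return False
    gaps = np.diff(rhos)
    return bool(np.all(np.abs(gaps - gaps.mean()) < spacing_tol))
```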
Third step: for each environment type, detect obstacles on the flight path with the corresponding obstacle detection algorithm.
In the corridor environment, an obstacle causes a marked change of brightness near the vanishing point. Brightness is characterized by the pixel mean, defined as
$$B = \frac{1}{w \times h}\sum_{i=1}^{h}\sum_{j=1}^{w} I(i, j)$$
where w and h are the width and height of the image region and I(i, j) is the gray value of the image at (i, j).
The obstacle criterion is defined as follows, with threshold Threshold2 = 30: if $|B_t - B_{t-1}| < Threshold2$, the path is considered clear; otherwise an obstacle is considered present.
In the stair environment, an obstacle causes a marked change in the length of the longest line extracted by the Hough transform. Let $l_t$ be the length of the longest line in the frame at time t. The obstacle criterion is defined as follows, with threshold Threshold3 = 15: if $|l_t - l_{t-1}| < Threshold3$, the path is considered clear; otherwise an obstacle is considered present. A sketch of the brightness computation follows.
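The pixel-mean brightness B is a direct computation; the following sketch evaluates it on the 50×50 patch centered at the vanishing point (the clamping at image borders is an illustrative choice).

```python
import numpy as np

def patch_brightness(gray: np.ndarray, vanishing_point, size: int = 50) -> float:
    # B: mean gray value I(i, j) over the size x size region centered at the
    # vanishing point (clamped at the image borders).
    x, y = int(round(vanishing_point[0])), int(round(vanishing_point[1]))
    half = size // 2
    patch = gray[max(0, y - half):y + half, max(0, x - half):x + half]
    return float(patch.mean())
```

An obstacle is then flagged whenever $|B_t - B_{t-1}|$ reaches Threshold2, and likewise for the longest-line length under Threshold3.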
In the room environment, an obstacle is always considered to be present.
Fourth step: if the obstacle detection result is clear, autonomous indoor flight of the MAV is realized through the flight control algorithm for the corresponding environment type.
For the corridor environment, let the vanishing point's coordinates in the image be $(\bar{x}, \bar{y})$. If $\bar{y}$ is less than half the image height, the controller sends the MAV a climb command; if $\bar{y}$ is greater than half the image height, a descend command; otherwise the altitude is held. If $\bar{x}$ is less than half the image width, the controller sends the MAV a fly-left command; if $\bar{x}$ is greater than half the image width, a fly-right command; otherwise the heading is held. For the stair environment, let dis be the distance between two adjacent parallel lines. If the dis of the current frame exceeds that of the previous frame by more than the threshold Threshold4 (set according to the accuracy requirement; as a rule less than one tenth of the image height), the controller sends the MAV a climb command; if the dis of the previous frame exceeds that of the current frame by more than Threshold4, a descend command; otherwise the altitude is held. Let $\bar{x}$ be the abscissa of the midpoint of the longest line: if $\bar{x}$ is less than half the image width, the controller sends a fly-left command; if $\bar{x}$ is greater than half the image width, a fly-right command; otherwise the heading is held.
Fifth step: if the obstacle detection result indicates an obstacle, autonomous obstacle avoidance of the MAV is realized through the obstacle avoidance algorithm.
While the UAV is in motion, the optical flow between two consecutive frames is computed with the Lucas-Kanade algorithm; the flow and the angular velocity provided by the inertial navigation system are processed with an extended Kalman filter to eliminate the rotational component of the flow and accurately estimate the translational component; the translational component is used to recover the MAV's translational motion and the distance to the obstacle; and according to this distance information the controller sends the MAV the corresponding avoidance command.
A specific embodiment is as follows:
1. During indoor flight of the MAV, the onboard forward-looking camera Camera1 acquires image a in real time, and the onboard downward-looking camera Camera2 acquires image b in real time.
Image sequences are acquired in real time by the UAV's onboard forward-looking and downward-looking optical cameras, but only the current frame and the previous frame need to be kept.
2. Design an indoor environment classifier and judge the environment type in which the MAV is located by preprocessing image a and image b.
The specific flow is shown in Fig. 2. Image a is first processed with the Canny edge detector, and lines are then extracted with the Hough transform. Since the vanishing point is an intersection of lines in the image, the detected horizontal and vertical lines must be removed; and since line detection is subject to error, the line intersections do not all pass exactly through the vanishing point. The region Z of maximum intersection density is therefore defined to be the vanishing point and is found by the sliding-window method.
The point density within the sliding window is defined as:
$$\rho_{window} = \frac{N}{S_{window}}$$
The point density over the whole image is:
$$\rho_{image} = \frac{K}{S_{image}}$$
where N is the number of intersection points in region Z, K is the number of intersection points in the whole image, $S_{window}$ is the area of region Z, and $S_{image}$ is the area of the image.
The vanishing-point criterion is defined as follows: region Z is judged to be a vanishing point when its point density sufficiently exceeds that of the whole image,
$$\frac{\rho_{window}}{\rho_{image}} > Threshold1$$
with Threshold1 = 30.
If a vanishing point exists, the environment of the MAV is determined to be a corridor, and the coordinates of the vanishing point are defined as
$$(\bar{x}, \bar{y}) = \frac{1}{N}\sum_{i=1}^{N}(x_i, y_i)$$
where $(x_i, y_i)$ are the coordinates of the intersection points in region Z.
If no vanishing point exists, apply the Hough transform to image b for line detection and judge whether there is a cluster of equidistant parallel lines: if there is, the indoor environment type is determined to be stairs; if not, it is determined to be a room.
3. For each environment type, detect obstacles on the flight path with the corresponding obstacle detection algorithm.
In the corridor environment, an obstacle causes a marked change of brightness near the vanishing point. Brightness is characterized by the pixel mean, defined as
$$B = \frac{1}{w \times h}\sum_{i=1}^{h}\sum_{j=1}^{w} I(i, j)$$
where w and h are the width and height of the image region and I(i, j) is the gray value of the image at (i, j).
The obstacle criterion is: if $|B_t - B_{t-1}| < Threshold2$, the path is considered clear; otherwise an obstacle is considered present.
In the stair environment, an obstacle causes a marked change in the length of the longest line extracted by the Hough transform. Let $l_t$ be the length of the longest line in the frame at time t. The obstacle criterion is: if $|l_t - l_{t-1}| < Threshold3$, the path is considered clear; otherwise an obstacle is considered present.
In addition, in the room environment an obstacle is always considered to be present.
4. If the obstacle detection result is clear, autonomous indoor flight of the MAV is realized through the flight control algorithm for the corresponding environment type.
In the corridor environment, the flight control information needed by the MAV can be provided by the position of the vanishing point in the image. Let the vanishing point's coordinates be $(\bar{x}, \bar{y})$, and let $w_{image}$ and $h_{image}$ denote the width and height of the image. The altitude control criterion is defined as: climb if $\bar{y} < h_{image}/2$, descend if $\bar{y} > h_{image}/2$, and hold altitude otherwise.
The direction control criterion is defined as: fly left if $\bar{x} < w_{image}/2$, fly right if $\bar{x} > w_{image}/2$, and hold heading otherwise.
In the stair environment, the lines in the downward-looking image b are extracted by the Hough transform and the lines representing the stairs, $l_1^t, \ldots, l_n^t$, are filtered out. The distance between two adjacent lines is defined as
$$Dis(l_i^t, l_{i+1}^t) = \mathrm{abs}(l_i^t.\rho - l_{i+1}^t.\rho), \quad i \in [1, n-1]$$
where $\rho$ is the distance parameter returned by the Hough transform. The altitude control criterion is: climb if the spacing of the current frame exceeds that of the previous frame by more than Threshold4, descend if the spacing of the previous frame exceeds that of the current frame by more than Threshold4, and hold altitude otherwise.
Threshold4 can be set according to the accuracy requirement.
Because the MAV's heading is arbitrary when it first enters the stair environment, it must be adjusted to the direction perpendicular to the stair lines, as shown in Fig. 3. The lines in the downward-looking image b are extracted by the Hough transform, and the cluster of parallel lines representing the stairs is filtered out; the angle of this cluster in the Hough parameter space is exactly the heading-angle adjustment the UAV must make, namely:
$$\Delta\beta = 90° - \theta_{hough}$$
where $\Delta\beta$ is the required heading adjustment and $\theta_{hough}$ is the angle of the parallel-line cluster given by the Hough transform. A sketch of both stair-geometry computations follows.
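Both stair-specific quantities, the adjacent-line spacing and the heading correction, come straight from the (rho, theta) pairs returned by the standard Hough transform. A sketch, assuming the stair cluster has already been selected as in the classifier above:

```python
import numpy as np

def stair_spacings(rhos) -> np.ndarray:
    # Dis(l_i, l_{i+1}) = |l_i.rho - l_{i+1}.rho| between adjacent lines of
    # the stair cluster; sorting makes neighbors in rho truly adjacent.
    r = np.sort(np.asarray(rhos, dtype=float))
    return np.abs(np.diff(r))

def heading_correction_deg(theta_hough_rad: float) -> float:
    # Delta beta = 90 - theta_hough: the yaw adjustment that turns the
    # flight direction perpendicular to the stair line cluster.
    return 90.0 - np.degrees(theta_hough_rad)
```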
When flying along the stairs, the MAV should stay at the horizontal center of the stairs. The forward-looking image a is processed to obtain the stair line information, the longest line among them is selected, and its midpoint is taken, i.e.
$$\bar{x} = (x_1 + x_2)/2$$
The direction control criterion is: fly left if $\bar{x} < w_{image}/2$, fly right if $\bar{x} > w_{image}/2$, and hold heading otherwise.
5. If the obstacle detection result indicates an obstacle, autonomous obstacle avoidance of the MAV is realized through the obstacle avoidance algorithm.
The present invention uses the optical flow method to determine the distance from an obstacle to the MAV and feeds this distance back to the controller to achieve autonomous obstacle avoidance. Visual psychology holds that when an observer and an observed object are in relative motion, the movement of optical feature points across the surface of the observed object provides motion and structural information. The motion of brightness patterns observed whenever there is relative motion between the camera and objects in the scene is called optical flow; the present invention computes it using block matching together with the Lucas-Kanade differential algorithm.
The optical flow can be approximated by the displacement $d = (d_x, d_y)$ of the same pixel across consecutive frames. Let $I_1(x, y, t)$ be the reference image. To determine the displacement of a pixel $X_1 = (x_1, y_1)$ in it, a block $P_\upsilon$ of size $\upsilon \times \upsilon$ centered at $(x_1, y_1)$ is taken, and the corresponding block in the next frame $I_2(x, y, t)$ is found by minimizing the following function:
$$SAD(X_1, d) = \sum_{i=-\upsilon}^{\upsilon}\sum_{j=-\upsilon}^{\upsilon}\left| I_1(x_1+i,\, y_1+j,\, t) - I_2(x_1+i+d_x,\, y_1+j+d_y,\, t+\delta t) \right|$$
The optical flow obtained by block matching is not very precise, so the Lucas-Kanade differential algorithm is used to compute the sub-pixel flow component. The optical flow constraint equation gives:
$$I_1(x, y, t) = I_2(x + d_{n_x} + d_{s_x},\; y + d_{n_y} + d_{s_y},\; t + \delta t)$$
Shifting $I_2$ by the integer displacement $d_n$ obtained from block matching yields a new image $I'_2$, so the above can be written as
$$I_1(x, y, t) = I'_2(x + d_{s_x},\; y + d_{s_y},\; t + \delta t)$$
Applying a Taylor expansion gives:
$$I_x \cdot \frac{d_{s_x}}{\delta t} + I_y \cdot \frac{d_{s_y}}{\delta t} + I_t = 0$$
The sub-pixel displacement can be obtained by minimizing the following over a spatial neighborhood S:
$$\sum_{(x,y) \in S} W^2(x, y)\left[\nabla I(x, y, t) \cdot d_s + I_t(x, y, t)\right]^2$$
where W(x, y) is a weighting diagonal matrix. The optimal solution of the above by weighted least squares is
$$d_s = [A^T W^2 A]^{-1} A^T W^2 b$$
where
$$A = [\nabla I(x_1, y_1), \ldots, \nabla I(x_n, y_n)]^T$$
$$b = -[I_t(x_1, y_1), \ldots, I_t(x_n, y_n)]^T$$
The total optical flow $d_m \in \mathbb{R}^2$ equals the sum of the block-matching displacement $d_n \in \mathbb{Z}^2$ and the sub-pixel displacement $d_s \in \mathbb{R}^2$. In practice both stages can be computed as sketched below.
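The coarse displacement and the sub-pixel refinement are both provided by OpenCV's pyramidal Lucas-Kanade tracker; the sketch below uses it as a stand-in for the two-stage computation described above, with feature and window parameters chosen purely for illustration.

```python
import cv2
import numpy as np

def sparse_flow(prev_gray: np.ndarray, cur_gray: np.ndarray):
    # Track corners between consecutive frames with pyramidal Lucas-Kanade;
    # returns the tracked points and their flow vectors d_m.
    p0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                 qualityLevel=0.01, minDistance=8)
    if p0 is None:
        return np.empty((0, 2)), np.empty((0, 2))
    p1, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, cur_gray, p0, None,
                                                winSize=(21, 21), maxLevel=3)
    good = status.ravel() == 1
    p0 = p0[good].reshape(-1, 2)
    p1 = p1[good].reshape(-1, 2)
    return p0, p1 - p0
```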
In addition, the optical flow $(\dot{x}_i, \dot{y}_i)$ at a point $(x_i, y_i)$ can be expressed in terms of the vehicle's velocity and angular velocity $(V_x, V_y, V_z, \Omega_x, \Omega_y, \Omega_z)$ and the depth $Z_i$:
$$\begin{bmatrix}\dot{x}_i \\ \dot{y}_i\end{bmatrix} = \begin{bmatrix} -\frac{1}{1+\beta Z_i} & 0 & \frac{\beta x_i}{1+\beta Z_i} \\ 0 & -\frac{1}{1+\beta Z_i} & \frac{\beta y_i}{1+\beta Z_i} \end{bmatrix} \begin{bmatrix} V_x \\ V_y \\ V_z \end{bmatrix} + \begin{bmatrix} \beta x_i y_i & -\left(\frac{1}{\beta}+\beta x_i^2\right) & y_i \\ \frac{1}{\beta}+\beta y_i^2 & -\beta x_i y_i & -x_i \end{bmatrix} \begin{bmatrix} \Omega_x \\ \Omega_y \\ \Omega_z \end{bmatrix}$$
where $\beta$ is the inverse of the camera focal length.
The above can be written as
$$OF = OF_t + OF_r$$
where OF is the total optical flow, $OF_t$ the translational component of the flow, and $OF_r$ its rotational component.
Therefore the rotational component is derived from the angular velocity provided by the inertial navigation system, and the translational component is obtained as the difference between the total flow and the rotational component; the translational component depends only on the depth of the obstacle. Using the translational component, the depth of the obstacle is estimated by the extended Kalman filter algorithm, and the controller takes the corresponding avoidance measure according to this depth information.
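Using the flow model above, the rotational component predicted by the gyro rates can be subtracted point by point to leave the translational component. The sketch below performs only this bare subtraction (the patent instead folds the step into an extended Kalman filter), and its rotational term follows the matrix as reconstructed above, so treat it as an assumption.

```python
import numpy as np

def derotate_flow(pts: np.ndarray, flow: np.ndarray, omega, beta: float) -> np.ndarray:
    # Subtract the gyro-predicted rotational flow OF_r at each tracked point,
    # leaving OF_t, which depends only on translation and obstacle depth Z_i.
    # Points are assumed expressed in the camera coordinates of the flow model.
    wx, wy, wz = omega   # (Omega_x, Omega_y, Omega_z) from the inertial unit
    of_t = np.empty_like(flow, dtype=float)
    for k, ((x, y), (u, v)) in enumerate(zip(pts, flow)):
        of_r = np.array([
            beta * x * y * wx - (1.0 / beta + beta * x * x) * wy + y * wz,
            (1.0 / beta + beta * y * y) * wx - beta * x * y * wy - x * wz,
        ])
        of_t[k] = np.array([u, v]) - of_r
    return of_t
```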

Claims (2)

1. An indoor autonomous navigation method for a micro air vehicle based on perspective images, characterized by the following steps:
Step 1: during indoor flight of the micro air vehicle (MAV), the onboard forward-looking camera Camera1 acquires image a in real time, and the onboard downward-looking camera Camera2 acquires image b in real time;
Step 2: analyze the perspective features of image a and image b to judge the environment type in which the MAV is located: corridor, stairs, or room. The process is as follows:
A. Apply the Hough transform to image a for line detection, obtaining image c, and judge whether image c contains a vanishing point, a vanishing point being a point that exhibits perspective convergence;
B. If there is a vanishing point, the indoor environment type is determined to be a corridor; if there is none, remove the horizontal and vertical lines in image c to obtain image d, then judge whether image d contains a vanishing point;
C. If there is a vanishing point, the environment type is likewise judged to be a corridor; if there is none, apply the Hough transform to image b for line detection, obtaining image e, and judge whether image e contains a cluster of equidistant parallel lines;
D. If there is a cluster of equidistant parallel lines, the indoor environment type is determined to be stairs; otherwise it is determined to be a room;
Step 3: for each environment type, detect obstacles on the flight path with the corresponding obstacle detection algorithm. The process is as follows:
A. For the corridor environment, the mean brightness of the 50×50 image region centered at the vanishing point serves as the criterion: if the difference between the mean brightness of the current frame and that of the previous frame is less than the threshold Threshold2=30, the path is considered clear; otherwise an obstacle is considered present;
B. For the stair environment, the length of the longest line in the image serves as the criterion: if the difference between the longest-line lengths of the current frame and the previous frame is less than the threshold Threshold3=15, the path is considered clear; otherwise an obstacle is considered present;
Step 4: when the detection result is clear, autonomous indoor flight of the MAV is realized through the flight control method for the corresponding environment type;
Step 5: if the obstacle detection result indicates an obstacle, autonomous obstacle avoidance of the MAV is realized through the obstacle avoidance method;
The flight control method for each environment type in step 4 is:
Situation 1: for the corridor environment, let the vanishing point's coordinates in the image be $(\bar{x}, \bar{y})$. If $\bar{y}$ is less than half the image height, the controller sends the MAV a climb command; if $\bar{y}$ is greater than half the image height, the controller sends a descend command; otherwise the altitude is held. If $\bar{x}$ is less than half the image width, the controller sends the MAV a fly-left command; if $\bar{x}$ is greater than half the image width, a fly-right command; otherwise the heading is held;
Situation 2: for the stair environment, let dis denote the distance between two adjacent parallel lines. If the dis of the current frame exceeds the dis of the previous frame by more than the threshold Threshold4 (this threshold is set according to the accuracy requirement; as a rule it is less than one tenth of the image height), the controller sends the MAV a climb command; if the dis of the previous frame exceeds the dis of the current frame by more than Threshold4, the controller sends a descend command; otherwise the altitude is held. Let $\bar{x}$ be the abscissa of the midpoint of the longest line: if $\bar{x}$ is less than half the image width, the controller sends the MAV a fly-left command; if $\bar{x}$ is greater than half the image width, a fly-right command; otherwise the heading is held.
2. The indoor autonomous navigation method for a micro air vehicle based on perspective images according to claim 1, characterized in that:
the obstacle avoidance method in step 5 is:
Step 1: compute the optical flow between two consecutive frames with the Lucas-Kanade algorithm;
Step 2: process the optical flow and the angular velocity provided by the inertial navigation system with an extended Kalman filter, eliminating the rotational component of the flow and accurately estimating the translational component;
Step 3: recover the MAV's translational motion information and the distance to the obstacle from the translational component of the optical flow;
Step 4: according to the obstacle distance information, the controller sends the MAV the corresponding avoidance command.
CN201410143275.0A 2014-04-10 2014-04-10 Indoor autonomous navigation method for a micro air vehicle based on perspective images Active CN103925920B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410143275.0A CN103925920B (en) 2014-04-10 2014-04-10 Indoor autonomous navigation method for a micro air vehicle based on perspective images

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410143275.0A CN103925920B (en) 2014-04-10 2014-04-10 Indoor autonomous navigation method for a micro air vehicle based on perspective images

Publications (2)

Publication Number Publication Date
CN103925920A CN103925920A (en) 2014-07-16
CN103925920B true CN103925920B (en) 2016-08-17

Family

ID=51144221

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410143275.0A Active CN103925920B (en) 2014-04-10 2014-04-10 Indoor autonomous navigation method for a micro air vehicle based on perspective images

Country Status (1)

Country Link
CN (1) CN103925920B (en)

Families Citing this family (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3754381A1 (en) 2013-12-10 2020-12-23 SZ DJI Technology Co., Ltd. Sensor fusion
KR101620580B1 * 2014-09-04 2016-05-12 주식회사 에스원 Method and system for detecting run
CN110174903B (en) 2014-09-05 2023-05-09 深圳市大疆创新科技有限公司 System and method for controlling a movable object within an environment
CN105517666B (en) * 2014-09-05 2019-08-27 深圳市大疆创新科技有限公司 Offline mode selection based on scene
WO2016033795A1 (en) 2014-09-05 2016-03-10 SZ DJI Technology Co., Ltd. Velocity control for an unmanned aerial vehicle
CN104236548B * 2014-09-12 2017-04-05 Tsinghua University Indoor autonomous navigation method for a micro air vehicle
CN104683773B (en) * 2015-03-25 2017-08-25 北京真德科技发展有限公司 UAV Video high speed transmission method
CN105000170B (en) * 2015-07-15 2017-11-28 珠海市磐石电子科技有限公司 The control method of touch screen controller and mobile devices
CN106909141A (en) * 2015-12-23 2017-06-30 北京机电工程研究所 Obstacle detection positioner and obstacle avoidance system
CN105606092B (en) * 2016-02-04 2019-02-15 中国科学院电子学研究所 A kind of Position Method for Indoor Robot and system
CN105759834B (en) * 2016-03-09 2018-07-24 中国科学院上海微系统与信息技术研究所 A kind of system and method actively capturing low latitude small-sized unmanned aircraft
CN106767719B (en) * 2016-12-28 2019-08-20 上海禾赛光电科技有限公司 The calculation method and gas remote measurement method of unmanned plane angle
CN107608384A * 2017-10-13 2018-01-19 南京涵曦月自动化科技有限公司 A UAV obstacle avoidance method
CN107972446A (en) * 2017-11-22 2018-05-01 六六房车有限公司 Temperature environment automates adjusting method in a kind of caravan
CN107972561A (en) * 2017-11-22 2018-05-01 六六房车有限公司 A kind of room in-vehicle air environment Intelligentized regulating and controlling system
CN108427424B (en) * 2018-05-14 2023-10-27 珠海一微半导体股份有限公司 Obstacle detection device and method and mobile robot
CN108830257A * 2018-06-29 2018-11-16 University of Electronic Science and Technology of China A potential obstacle detection method based on monocular optical flow
CN109238288A * 2018-09-10 2019-01-18 University of Electronic Science and Technology of China Indoor autonomous navigation method for an unmanned aerial vehicle
CN111358364B (en) * 2018-12-26 2021-09-07 珠海市一微半导体有限公司 Dead angle cleaning method and device based on visual robot, chip and robot
IL268486B (en) * 2019-08-04 2020-08-31 Flyviz Indoor Ltd Autonomous aerial system and method
CN112050810B (en) * 2019-12-23 2022-09-27 华北电力大学(保定) Indoor positioning navigation method and system based on computer vision
CN111445491B (en) * 2020-03-24 2023-09-15 山东智翼航空科技有限公司 Three-neighborhood maximum difference edge detection narrow channel guiding method for miniature unmanned aerial vehicle
CN112180988B (en) * 2020-10-10 2024-03-19 广州海格星航信息科技有限公司 Route planning method and storage medium for three-dimensional outdoor space multi-rotor unmanned aerial vehicle
CN113065499B (en) * 2021-04-14 2022-07-01 湖南大学 Air robot cluster control method and system based on visual learning drive
CN116048120B (en) * 2023-01-10 2024-04-16 中国建筑一局(集团)有限公司 Autonomous navigation system and method for small four-rotor unmanned aerial vehicle in unknown dynamic environment

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102955478A (en) * 2012-10-24 2013-03-06 深圳一电科技有限公司 Unmanned aerial vehicle flying control method and unmanned aerial vehicle flying control system

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102955478A (en) * 2012-10-24 2013-03-06 深圳一电科技有限公司 Unmanned aerial vehicle flying control method and unmanned aerial vehicle flying control system

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
"Autonomous MAV Flight in Indoor Environments using"; Cooper Bills; 2011 IEEE International Conference on Robotics and Automation; 2011-05-13; pp. 5776-5783 *
"MAV Navigation through Indoor Corridors Using Optical Flow"; Simon Zingg; 2010 IEEE International Conference on Robotics and Automation; 2010-05-08; pp. 3361-3368 *
"Vision-Based Guidance and Control of a Hovering Vehicle in Unknown, GPS-denied Environments"; Spencer Ahrens; 2009 IEEE International Conference on Robotics and Automation; 2009-05-17; pp. 2643-2648 *
"A survey of indoor UAV autonomous navigation and guidance technology relying on onboard sensors" (依靠自身传感器的室内无人机自主导航引导技术综述); 倪磊 (Ni Lei) et al.; Computer Applications and Software (计算机应用与软件); 2012-08; Vol. 29, No. 8; pp. 160-163 *

Also Published As

Publication number Publication date
CN103925920A (en) 2014-07-16

Similar Documents

Publication Publication Date Title
CN103925920B (en) Indoor autonomous navigation method for a micro air vehicle based on perspective images
CN106681353B (en) UAV obstacle avoidance method and system based on the fusion of binocular vision and optical flow
US11481024B2 (en) Six degree of freedom tracking with scale recovery and obstacle avoidance
US11218689B2 (en) Methods and systems for selective sensor fusion
CN105946853B (en) The system and method for long range automatic parking based on Multi-sensor Fusion
CN106441286B (en) UAV tunnel inspection system based on BIM technology
EP2209091B1 (en) System and method for object motion detection based on multiple 3D warping and vehicle equipped with such system
Strydom et al. Visual odometry: autonomous uav navigation using optic flow and stereo
Shen et al. Vision-Based State Estimation and Trajectory Control Towards High-Speed Flight with a Quadrotor.
CN105644785B (en) UAV landing method based on optical flow and horizon detection
CN104880187B (en) Motion estimation method for an aircraft optical-flow detection device based on dual cameras
US8005257B2 (en) Gesture recognition apparatus and method
Krombach et al. Feature-based visual odometry prior for real-time semi-dense stereo SLAM
Williams et al. Feature and pose constrained visual aided inertial navigation for computationally constrained aerial vehicles
McManus et al. Distraction suppression for vision-based pose estimation at city scales
CN111489392A (en) Single target human motion posture capturing method and system in multi-person environment
Zhang et al. Vision-based relative altitude estimation of small unmanned aerial vehicles in target localization
KR101319526B1 (en) Method for providing location information of target using mobile robot
CN112945233B (en) Global drift-free autonomous robot simultaneous positioning and map construction method
Daftry et al. Semi-dense visual odometry for monocular navigation in cluttered environment
CN111157008B (en) Local autonomous navigation system and method based on multidimensional environment information perception
von Stumberg et al. Autonomous exploration with a low-cost quadrocopter using semi-dense monocular slam
Trisiripisal et al. Stereo analysis for vision-based guidance and control of aircraft landing
CN109933092B (en) Aircraft obstacle avoidance method and device, readable storage medium and aircraft
Liu et al. Binocular vision-based autonomous path planning for UAVs in unknown outdoor scenes

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20190729

Address after: Room 404, Material Building, Northwest Polytechnic University, 127 Youyi West Road, Xi'an City, Shaanxi Province, 710072

Patentee after: Northwestern Polytechnical University Asset Management Co.,Ltd.

Address before: 710072 Xi'an friendship West Road, Shaanxi, No. 127

Patentee before: Northwestern Polytechnical University

TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20191115

Address after: 710072 floor 19, building B, innovation and technology building, northwest Polytechnic University, No.127, Youyi West Road, Beilin District, Xi'an, Shaanxi Province

Patentee after: Shaanxi CISCO Rudi Network Security Technology Co.,Ltd.

Address before: Room 404, Material Building, Northwest Polytechnic University, 127 Youyi West Road, Xi'an City, Shaanxi Province, 710072

Patentee before: Xi'an Northwestern Polytechnical University Asset Management Co.,Ltd.

TR01 Transfer of patent right
CP01 Change in the name or title of a patent holder

Address after: Floor 19, block B, innovation and technology building, Northwest University of technology, 127 Youyi West Road, Beilin District, Xi'an City, Shaanxi Province, 710072

Patentee after: Shaanxi University of technology Ruidi Information Technology Co.,Ltd.

Address before: Floor 19, block B, innovation and technology building, Northwest University of technology, 127 Youyi West Road, Beilin District, Xi'an City, Shaanxi Province, 710072

Patentee before: Shaanxi CISCO Rudi Network Security Technology Co.,Ltd.

CP01 Change in the name or title of a patent holder
TR01 Transfer of patent right

Effective date of registration: 20231012

Address after: 518000 Unit 204, Xingyuanju 2, Xilihu Road, Xili Street, Nanshan District, Shenzhen, Guangdong Province

Patentee after: Shenzhen Onoan Technology Co.,Ltd.

Address before: Floor 19, block B, innovation and technology building, Northwest University of technology, 127 Youyi West Road, Beilin District, Xi'an City, Shaanxi Province, 710072

Patentee before: Shaanxi University of technology Ruidi Information Technology Co.,Ltd.

TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20231218

Address after: 710000, No. 581, East Zone, National E-commerce Demonstration Base, No. 528 Tianguba Road, Software New City, High tech Zone, Xi'an City, Shaanxi Province

Patentee after: Xi'an Chenxiang Zhuoyue Technology Co.,Ltd.

Address before: 518000 Unit 204, Xingyuanju 2, Xilihu Road, Xili Street, Nanshan District, Shenzhen, Guangdong Province

Patentee before: Shenzhen Onoan Technology Co.,Ltd.

TR01 Transfer of patent right