CN106681353B - UAV obstacle avoidance method and system based on fusion of binocular vision and optical flow - Google Patents
- Publication number
- CN106681353B CN106681353B CN201611069481.7A CN201611069481A CN106681353B CN 106681353 B CN106681353 B CN 106681353B CN 201611069481 A CN201611069481 A CN 201611069481A CN 106681353 B CN106681353 B CN 106681353B
- Authority
- CN
- China
- Prior art keywords
- obstacle
- depth
- UAV
- information
- value
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/0094—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
Abstract
The invention discloses a UAV obstacle avoidance method and system based on the fusion of binocular vision and optical flow. The method acquires images in real time from an onboard binocular camera; computes image depth information on a graphics processor (GPU); uses the depth information to extract the geometric profile of the most threatening obstacle and computes its threat distance with a threat-depth model; fits a rectangle to the obstacle's geometric profile to obtain an obstacle tracking window, and computes the optical-flow field of the obstacle region to obtain the obstacle's velocity relative to the UAV; and the flight control computer issues evasive maneuver commands from the computed obstacle distance, geometric profile, and relative velocity to avoid the obstacle. By effectively fusing obstacle depth information with optical-flow vectors, the invention acquires the obstacle's motion relative to the UAV in real time, improves the UAV's capability for fast visual obstacle avoidance, and offers substantially better real-time performance and accuracy than traditional algorithms.
Description
Technical field
The present invention relates to UAV obstacle avoidance methods, and more particularly to a UAV obstacle avoidance method and system based on the fusion of binocular vision and optical flow, belonging to the technical field of UAV obstacle avoidance.
Background technique
With the development of UAV technology and its application market, UAVs increasingly face special missions unlike those of the past, and these missions place stricter requirements on their ability to quickly identify and evade obstacles along the flight path. Vision-based obstacle avoidance systems generally use a passive sensing mode and therefore feature simple equipment, low cost, good economy, and a wide range of applications.
Compared with obstacle avoidance systems based on active sensors such as ultrasound and lidar, vision-based systems respond faster, achieve higher precision, and provide richer information such as color, texture, and geometry, and have therefore received growing attention.
Compared with monocular vision, binocular vision can recover range information along the camera axis, judge the relative position of an obstacle and the UAV far more effectively, and also helps segment obstacles quickly and accurately from a complex background. Binocular vision is now widely used in fields such as robot navigation and target tracking.
The optical-flow method predicts pixel motion from the correlation of pixel intensities in an image sequence; it studies the temporal variation of image brightness to establish the motion field of a set of object pixels. Under normal circumstances, optical flow can effectively and accurately measure the motion associated with camera motion, target motion in the scene, or both, and the measured quantity can represent the instantaneous velocity of the target's motion.
Optical-flow fields are usually computed by either sparse or dense methods. Sparse optical flow selects feature points in the scene and fits the velocity of the entire motion field from measurements of those feature-point velocities. Dense optical flow instead computes the motion field of the whole region and thereby obtains the velocity of the target area relative to the camera. Sparse optical flow is fast but has large estimation error; dense optical flow is more accurate, but without precise segmentation of the target region its computation time grows greatly, so it usually must be paired with a fast and accurate image segmentation algorithm.
Application No. CN201410565278.3, "Vehicle motion information detection method based on fusion of binocular stereo vision and optical flow", mainly marks points of interest on the ground with binocular vision, computes their optical-flow values, and finally estimates the three-dimensional translational and rotational velocity of a ground vehicle by least-squares fitting. This method substitutes feature-point velocities for the vehicle's velocity; although it speeds up computation, the estimation accuracy is hard to guarantee. Moreover, it estimates the motion of the vehicle itself and cannot identify or evade obstacles encountered while traveling, so it is difficult to use in practice.
Application No. CN201110412394.8, "Automatic obstacle avoidance planning method for an inspection probe based on binocular stereo vision", mainly computes the three-dimensional coordinates of all pixels in the camera image by binocular stereo vision to form a three-dimensional map of sensing points, and selects an obstacle-avoiding optimal path from that map. The method must compute three-dimensional coordinates for every pixel in the field of view, which places very high demands on processor performance and memory capacity, making it unsuitable for small embedded airborne equipment.
Application No. CN201510688485.2, "Autonomous obstacle detection system and method for a UAV based on binocular vision", highlights a hardware architecture for obstacle detection with a binocular camera, mainly using an FPGA as the computing core for the binocular images. Although FPGAs are compact and fast, they are expensive and must be programmed in specialized development languages, which hinders interfacing with other modules. Furthermore, that patent only describes the hardware architecture for binocular obstacle detection and does not explain a specific obstacle-detection algorithm.
Therefore, although obstacle avoidance with binocular vision has been widely studied at home and abroad, most methods cannot quickly obtain the position and velocity of an obstacle relative to the UAV and thus cannot evade obstacles rapidly and accurately, so they are difficult to apply to real-time UAV obstacle avoidance.
Summary of the invention
The technical problem to be solved by the present invention is to provide a UAV obstacle avoidance method and system based on the fusion of binocular vision and optical flow, which effectively fuses obstacle depth information with optical-flow vectors, acquires an obstacle's motion relative to the UAV in real time, and realizes real-time UAV obstacle avoidance.
The present invention adopts the following technical scheme to solve the above technical problem.
A UAV obstacle avoidance method based on the fusion of binocular vision and optical flow comprises the following steps:
Step 1: acquire images in the UAV's direction of advance with a binocular camera and convert them to grayscale;
Step 2: compute the feature information of each pixel in the grayscale images and perform stereo matching to obtain a depth map of the UAV's direction of advance;
Step 3: divide the depth values in the depth map into two classes, obstacle and background, thereby splitting the map into an obstacle region and a background region; take the largest closed contour in the obstacle region as the obstacle profile and fit a rectangle to it, yielding an obstacle tracking window as the obstacle's geometric information;
Step 4: compute the velocity vector of the obstacle tracking window with a dense optical-flow method, obtaining the window's speed in the x and y directions; predict the window's position in the next frame from this velocity and compare the prediction with the window actually computed in the next frame; if the difference is below a threshold, proceed to Step 5, otherwise return to Step 1 and recompute;
Step 5: compute the obstacle's threat depth value for the UAV using the threat-depth model;
Step 6: convert the obstacle geometric information from Step 3 and the obstacle velocity information from Step 4 from pixel coordinates to world coordinates, and correct them with the UAV's motion parameters;
Step 7: send the obstacle position information together with the geometric and velocity information from Step 6 to the UAV's flight control computer, which uses this information to command real-time evasive action.
As a preferred embodiment of the method, the detailed process of acquiring images in the UAV's direction of advance with the binocular camera in Step 1 is: mount the binocular camera at the nose of the UAV; obtain the intrinsic and extrinsic parameter matrices and the distortion parameters of the binocular camera by calibration; acquire images in the direction of advance with the binocular camera; and rectify the images with the intrinsic/extrinsic parameter matrices and distortion parameters to obtain two undistorted, row-aligned images.
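The rectification step above can be sketched as follows. This is a minimal NumPy illustration of building an undistortion lookup map from an intrinsic matrix and radial distortion parameters; the two-term radial model and all parameter values are assumptions for illustration, not the patent's calibration output, and the row-alignment with the extrinsic rotation between the two cameras is omitted.

```python
import numpy as np

def undistort_map(w, h, fx, fy, cx, cy, k1, k2):
    """Build pixel lookup maps for a simple two-term radial model.

    For each undistorted output pixel, compute where to sample in the
    distorted image: x_d = x * (1 + k1*r^2 + k2*r^4) in normalized
    camera coordinates, then project back to pixels.
    """
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) / fx                  # normalized camera coordinates
    y = (v - cy) / fy
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    map_x = x * scale * fx + cx        # back to pixel coordinates
    map_y = y * scale * fy + cy
    return map_x, map_y

# Hypothetical 640x480 camera; the image would then be resampled at
# (map_x, map_y) to produce the undistorted view.
mx, my = undistort_map(640, 480, fx=500.0, fy=500.0,
                       cx=320.0, cy=240.0, k1=-0.1, k2=0.01)
```

The principal point is a fixed point of the map, which gives a quick sanity check on a real calibration as well.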
As a preferred embodiment of the method, the detailed process of Step 2 is: for each pixel of the grayscale images, compute and accumulate the energy function along 8 directions (up, down, left, right, upper-left, lower-left, upper-right, lower-right); find the disparity value that minimizes the accumulated energy function; and determine each pixel's depth value from its disparity.
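As a rough sketch of this matching step: the patent aggregates a per-pixel matching cost along 8 directions (a semi-global scheme) before minimizing. The NumPy toy below performs only the simplest winner-takes-all minimization of a window SAD cost over candidate disparities, so it illustrates the disparity-from-energy-minimization idea rather than the full 8-direction aggregation; window size and disparity range are illustrative assumptions.

```python
import numpy as np

def disparity_sad(left, right, max_disp=8, win=2):
    """Winner-takes-all disparity from a window SAD cost.

    For each left-image pixel x and candidate disparity d, the cost
    compares left[:, x] with right[:, x - d]; the d with the smallest
    aggregated cost wins. (The patent instead aggregates this cost
    along 8 scanline directions before minimizing.)
    """
    h, w = left.shape
    L = left.astype(np.float64)
    R = right.astype(np.float64)
    cost = np.full((max_disp + 1, h, w), np.inf)
    for d in range(max_disp + 1):
        diff = np.abs(L[:, d:] - R[:, :w - d])   # pixel cost at disparity d
        p = np.pad(diff, win, mode='edge')
        agg = np.zeros_like(diff)
        for dy in range(2 * win + 1):            # (2*win+1)^2 box sum
            for dx in range(2 * win + 1):
                agg += p[dy:dy + diff.shape[0], dx:dx + diff.shape[1]]
        cost[d, :, d:] = agg
    return np.argmin(cost, axis=0)               # disparity map
```

On a rectified pair where the right view is the left view shifted by a constant amount, the interior of the returned map recovers that shift exactly.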
As a preferred embodiment of the method, before taking the largest closed contour in the obstacle region as the obstacle profile in Step 3, the depth map in which obstacle and background regions have been distinguished is filtered with a speckle filter to remove noise.
As a preferred embodiment of the method, the detailed process of dividing the depth values into obstacle and background classes in Step 3 is:
Set a segmentation threshold D_h; depth values greater than or equal to D_h are classified as obstacle, and depth values less than D_h as background. D_h is found by maximizing the between-class variance σ² of obstacle and background:
σ² = ω₀ · ω₁ · (μ₀ − μ₁)²
where ω₀ and ω₁ are the probabilities that a depth value is assigned by D_h to the obstacle class and the background class respectively, and μ₀ and μ₁ are the mean depth values of the obstacle and background classes:
μ₀ = Σ_{j=1..h} D_j·K_j / Σ_{j=1..h} K_j,  μ₁ = Σ_{j=h+1..t} D_j·K_j / Σ_{j=h+1..t} K_j
where D_i (i = 1, …, t) are the discrete credible depth layers, t is the number of depth layers, D_1, …, D_h are the layers belonging to the obstacle, D_{h+1}, …, D_t are the layers belonging to the background, and K_j (j = 1, …, t) is the number of depth values in each layer.
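The layer-wise Otsu criterion above can be sketched directly. The function below searches for the split index h that maximizes σ² = ω₀·ω₁·(μ₀ − μ₁)² over discrete depth layers with counts K_j; the convention that layers are ordered from near (large depth value, matching D_1…D_h as the obstacle class) to far is an assumption made for illustration.

```python
import numpy as np

def otsu_depth_threshold(layers, counts):
    """Pick the split maximizing between-class variance over depth layers.

    layers: credible depth layer values D_1..D_t, ordered so the first
            h layers form the obstacle class (assumed near-to-far)
    counts: number of depth values K_j falling in each layer
    Returns the threshold D_h: values >= D_h are classed as obstacle.
    """
    D = np.asarray(layers, dtype=np.float64)
    K = np.asarray(counts, dtype=np.float64)
    total = K.sum()
    best_h, best_var = 1, -1.0
    for h in range(1, len(D)):           # obstacle = layers[:h]
        k0, k1 = K[:h].sum(), K[h:].sum()
        if k0 == 0 or k1 == 0:
            continue
        w0, w1 = k0 / total, k1 / total  # class probabilities
        mu0 = (D[:h] * K[:h]).sum() / k0
        mu1 = (D[h:] * K[h:]).sum() / k1
        var = w0 * w1 * (mu0 - mu1) ** 2
        if var > best_var:
            best_h, best_var = h, var
    return D[best_h - 1]                 # last layer classed as obstacle
```

On a clearly bimodal layer histogram the split falls between the two modes.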
As a preferred embodiment of the method, the detailed process of Step 5 is:
Let D_K = {d₁, d₂, …, d_K} be the set of depth values belonging to the obstacle, where d₁, …, d_K are depth values and K is their number; let D₁, …, D_t be the discrete credible depth layers, t the number of layers, and K₁, …, K_t the number of depth values in each layer. If at least one K_j (1 ≤ j ≤ t) reaches a set count threshold, the obstacle's threat depth value for the UAV is determined from D_min, the nearest depth layer among those whose count exceeds the threshold, with K_min the number of depth values in D_min. If all of K₁, …, K_t fall below the threshold, the threat depth value is instead taken over the depth layers as a whole.
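A sketch of the threat-depth model follows, with the caveat that the exact count threshold and the fallback expression appear only as figures in the original document: taking D_min itself as the threat depth and using a count-weighted mean as the fallback are assumptions made to produce a runnable illustration.

```python
import numpy as np

def threat_depth(layers, counts, count_threshold):
    """Threat depth of an obstacle for the UAV (assumed reconstruction).

    If at least one depth layer holds enough samples, return the
    nearest such layer D_min (the original combines D_min with its
    count K_min; the simple choice here is an assumption). Otherwise
    fall back to the count-weighted mean depth of all layers.
    """
    D = np.asarray(layers, dtype=np.float64)
    K = np.asarray(counts, dtype=np.float64)
    dense = K >= count_threshold
    if dense.any():
        return D[dense].min()            # nearest well-supported layer
    return (D * K).sum() / K.sum()       # count-weighted mean fallback
```

With one dominant layer the nearest dominant depth wins; with no dominant layer the weighted mean is used.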
A UAV obstacle avoidance system based on the fusion of binocular vision and optical flow comprises an image acquisition module, an image processing module, an inertial measurement module, and a GNSS module mounted on the UAV, the UAV including a flight control computer. The image acquisition module passes synchronized images of the UAV's direction of advance to the image processing module, which comprises a CPU module and a GPU module and computes the geometry, velocity, and position of obstacles. The GNSS module and the inertial measurement module perform real-time positioning and attitude measurement of the UAV respectively, and the inertial measurement module also corrects the velocity information computed by the image processing module. The flight control computer fuses the information sent by the image processing module and commands the UAV to perform evasive action before reaching the obstacle.
As a preferred embodiment of the system, the system further comprises an ultrasonic module with four ultrasonic sensors mounted on the front, rear, left, and right of the UAV, for detecting obstacles in those four directions.
Compared with the prior art, the above technical scheme has the following technical effects:
1. The invention segments obstacles effectively by binocular vision and computes optical flow only inside the obstacle tracking window, solving the long computation time of dense optical flow over the entire image and further improving the real-time performance of the obstacle avoidance algorithm.
2. The invention proposes a new threat-depth model that simplifies the avoidance process without introducing overly complex path planning algorithms, giving it high practical value.
3. The invention computes stereo matching and optical flow on the GPU while the CPU computes obstacle geometry, position, and speed; computing on the GPU and CPU simultaneously raises the algorithm's computation speed.
Description of the drawings
Fig. 1 is the checkerboard pattern used for camera calibration in the present invention.
Fig. 2 is the hardware architecture diagram of the UAV obstacle avoidance system based on the fusion of binocular vision and optical flow.
Fig. 3 is the algorithm flowchart of the UAV obstacle avoidance method based on the fusion of binocular vision and optical flow.
Fig. 4 is a schematic diagram of the avoidance strategy for depth-continuous obstacles in an embodiment of the present invention.
Specific embodiments
Embodiments of the present invention are described in detail below, with examples shown in the accompanying drawings. The embodiments described with reference to the drawings are exemplary, serve only to explain the invention, and are not to be construed as limiting the claims.
As shown in Fig. 2, the hardware architecture of the UAV obstacle avoidance system based on the fusion of binocular vision and optical flow comprises an image acquisition module, an embedded image processing module, a flight control computer, an ultrasonic module, a GNSS (Global Navigation Satellite System) module, and an inertial measurement module. The embedded image processing module comprises a CPU module and a GPU module. The ultrasonic module comprises four ultrasonic sensors mounted on the front, rear, left, and right of the UAV as obstacle-avoidance aids: when an ultrasonic sensor detects an obstacle in another direction, the flight control computer adjusts the avoidance action according to the obstacle distance information.
The image acquisition module passes synchronized left and right images to the embedded image processing module; the CPU module and GPU module compute the UAV's avoidance parameters; the GNSS module and inertial measurement module handle real-time positioning and attitude measurement of the UAV, and the inertial measurement module additionally sends its measurements to the embedded image processing module over a serial port to correct the computed velocity information; the flight control computer effectively fuses the information transmitted by the embedded image processing module and the ultrasonic module and commands the UAV to perform evasive action before reaching the obstacle.
In this example the image acquisition module uses a binocular camera with a resolution of 640×480 or 800×600 and a frame rate of 20 to 30 fps; the baseline is 12 cm, and both baseline and focal length are adjustable. The two synchronized video streams can be fed directly into the embedded image processing module over USB or another high-speed interface.
As shown in Fig. 3, the algorithm flow of the UAV obstacle avoidance method based on the fusion of binocular vision and optical flow is as follows.
First, the binocular camera is calibrated with the checkerboard shown in Fig. 1 to obtain the intrinsic parameter matrix and distortion parameters of each camera and the extrinsic parameters between the two cameras (rotation matrix and translation vector), which are stored in the memory of the embedded image processing module.
The synchronized video from the binocular camera is read in, and the two images are rectified with the intrinsic matrices, distortion parameters, and extrinsic parameters so that they are undistorted and row-aligned. The two images are converted to gray space, and the GPU module computes, for every pixel in parallel, the energy function along 8 directions (up, down, left, right, upper-left, lower-left, upper-right, lower-right), accumulates it, and determines each pixel's depth (i.e., the depth map) by finding in parallel the disparity that minimizes the accumulated energy. The CPU module derives the obstacle tracking window from the depth information computed by the GPU module, determining the obstacle's geometric and position information. The GPU module then computes in parallel the optical-flow values of all pixels inside the selected tracking window to obtain the relative velocity, and finally the CPU module corrects the velocity information and sends it, together with the position information and geometric dimensions, to the flight control computer for processing.
The depth values in the depth map are divided into two classes: depth values D_K = {d₁, d₂, …, d_K} belonging to the obstacle, and depth values D_b belonging to the background. The segmentation threshold D_h is found by maximizing the variance between D_K and D_b; depth values below D_h are set to 0, the contours in the processed depth map are computed, and a rectangle is fitted to the largest contour to obtain the obstacle's rectangular tracking window.
The between-class variance σ² of the two depth classes may be expressed as:
σ² = ω₀ · ω₁ · (μ₀ − μ₁)²
where ω₀ and ω₁ are the probabilities of the two depth classes split by D_h, and μ₀ and μ₁ are the two class mean depths:
μ₀ = Σ_{j=1..h} D_j·K_j / Σ_{j=1..h} K_j,  μ₁ = Σ_{j=h+1..t} D_j·K_j / Σ_{j=h+1..t} K_j
where {D₁, …, D_t} = D_r are the discrete credible depth layers, K_j is the number of depth values in each layer, and t is the number of depth layers.
The depth map in which obstacle and background have been distinguished is extracted and its contours are computed. Before contour detection, the depth map is filtered with a speckle filter to remove small blocky depth regions; the detected closed contour of largest area is taken as the obstacle profile and fitted with a rectangle, and the fitted rectangle serves as the obstacle tracking window.
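The contour-and-rectangle step can be sketched without any vision library as follows: find the largest 4-connected region in the binary obstacle mask by flood fill and fit an axis-aligned bounding rectangle, which plays the role of the obstacle tracking window. The speckle filtering described above is assumed to have already been applied to the input mask.

```python
import numpy as np
from collections import deque

def largest_blob_bbox(mask):
    """Return (x, y, width, height) of the largest 4-connected
    foreground region in a binary mask, or None if the mask is empty."""
    mask = np.asarray(mask, dtype=bool)
    h, w = mask.shape
    seen = np.zeros_like(mask)
    best = None                               # (size, x, y, width, height)
    for sy in range(h):
        for sx in range(w):
            if not mask[sy, sx] or seen[sy, sx]:
                continue
            q = deque([(sy, sx)])             # BFS flood fill from seed
            seen[sy, sx] = True
            pts = []
            while q:
                y, x = q.popleft()
                pts.append((y, x))
                for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                    if 0 <= ny < h and 0 <= nx < w \
                            and mask[ny, nx] and not seen[ny, nx]:
                        seen[ny, nx] = True
                        q.append((ny, nx))
            ys = [p[0] for p in pts]
            xs = [p[1] for p in pts]
            if best is None or len(pts) > best[0]:
                best = (len(pts), min(xs), min(ys),
                        max(xs) - min(xs) + 1, max(ys) - min(ys) + 1)
    return None if best is None else best[1:]
```

The returned rectangle is the tracking window handed to the optical-flow stage.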
The obstacle's threat depth D₀ for the UAV is then computed. Assume D_K = {d₁, d₂, …, d_K} is a set of depth values belonging to the obstacle, d₁, …, d_K are those depth values, K is their number, D_r is the credible depth interval, D₁, D₂, …, D_t are the discrete credible depth layers on this interval with {D₁, D₂, …, D_t} = D_r, and K₁, …, K_t are the numbers of depth values in each layer D₁, …, D_t. If, for some 1 ≤ j ≤ t, K_j reaches a set count threshold, the threat depth D₀ of the obstacle tracking window is determined from D_min, the nearest depth layer among those whose K_j exceeds the threshold, with K_min the number of depth values in D_min; if all K_j fall below the threshold, the threat depth D₀ is instead taken over the depth layers as a whole.
The optical-flow field inside the obstacle tracking window is computed with the Horn-Schunck dense optical-flow method, yielding the tracking window's speed in the x and y directions, from which the window's position in the next frame is predicted. The predicted position is compared with the tracking-window position actually computed in the next frame: if they differ by fewer than 10 pixels, the computation is judged accurate; if by more than 10 pixels, the computation is judged erroneous and the algorithm returns to the first step to recompute.
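A compact sketch of the Horn-Schunck iteration and of the 10-pixel consistency check described above; the regularization weight `alpha` and the iteration count are illustrative assumptions, and the gradients use simple central differences rather than the original paper's stencils.

```python
import numpy as np

def horn_schunck(I1, I2, alpha=1.0, n_iter=50):
    """Dense Horn-Schunck optical flow between two grayscale frames."""
    I1 = I1.astype(np.float64)
    I2 = I2.astype(np.float64)
    Ix = np.gradient(I1, axis=1)      # spatial derivatives
    Iy = np.gradient(I1, axis=0)
    It = I2 - I1                      # temporal derivative
    u = np.zeros_like(I1)
    v = np.zeros_like(I1)

    def avg(f):                       # 4-neighbour average, edge-replicated
        p = np.pad(f, 1, mode='edge')
        return (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]) / 4.0

    for _ in range(n_iter):           # Jacobi-style update of the flow
        ub, vb = avg(u), avg(v)
        t = (Ix * ub + Iy * vb + It) / (alpha ** 2 + Ix ** 2 + Iy ** 2)
        u = ub - Ix * t
        v = vb - Iy * t
    return u, v

def prediction_consistent(win_xy, flow_uv, measured_xy, tol=10.0):
    """Step-4 check: predict the next-frame window position from the
    flow and accept only if it lies within `tol` pixels of the window
    actually detected in the next frame."""
    px = win_xy[0] + flow_uv[0]
    py = win_xy[1] + flow_uv[1]
    return np.hypot(px - measured_xy[0], py - measured_xy[1]) < tol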
Using the intrinsic and extrinsic parameter matrices, the obstacle geometric information and obstacle velocity information represented by the computed tracking window are converted from pixel coordinates to world coordinates, and the obstacle's velocity relative to the UAV is corrected by a formula in which v_x and v_y are the corrected speeds in the UAV's x and y directions; f_x and f_y are the focal lengths in the x and y directions; u and v are the obstacle's optical-flow components in the x and y directions computed in Step 4; D₀ is the threat depth of the obstacle tracking window; θ and φ are the UAV's pitch and yaw angles; and Δt is the time between two frames.
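The correction can be sketched as below. The pinhole scaling of the flow by D₀/f per frame interval follows the variables listed in the text; subtracting the rotation-induced flow as the small-angle pitch/yaw change between frames is an assumed form of the correction, since the original formula is only available as a figure.

```python
import numpy as np

def obstacle_velocity(u, v, depth, fx, fy, dpitch, dyaw, dt):
    """Convert pixel optical flow to metric obstacle velocity relative
    to the UAV (assumed small-angle attitude compensation).

    u, v      : optical-flow components in pixels per frame
    depth     : threat depth D0 of the tracking window (m)
    fx, fy    : focal lengths in pixels
    dpitch,
    dyaw      : change in pitch/yaw between the two frames (rad)
    dt        : time between frames (s)
    """
    vx = depth * (u / fx - dyaw) / dt     # lateral speed, yaw-compensated
    vy = depth * (v / fy - dpitch) / dt   # vertical speed, pitch-compensated
    return vx, vy
```

With no attitude change the expression reduces to the plain pinhole relation v = flow · depth / (focal · Δt).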
The computed speed, geometry, position, and threat-depth information is sent to the UAV flight control system, which commands the UAV to perform real-time evasive action to avoid the obstacle on the flight path. The position information is the offset of the obstacle tracking window's center relative to the image center.
As shown in Fig. 4, when evading a depth-continuous obstacle, the obstacle is decomposed into multiple obstacles, and the threat depth of each part is computed separately in different frames to perform avoidance.
The present invention obtains the depth map with a GPU-accelerated stereo matching algorithm, then obtains the obstacle's geometric dimensions and its position and velocity relative to the UAV through the steps of obstacle segmentation, threat-depth computation, and optical-flow computation, and sends them to the flight control computer to generate avoidance actions. The algorithm raises computation speed by using the GPU and CPU simultaneously; by segmenting obstacles effectively with binocular vision and computing optical flow only inside the obstacle tracking window, it solves the long computation time of dense optical flow over the whole image and further improves the real-time performance of obstacle avoidance. The proposed threat-depth model simplifies the avoidance process without introducing overly complex path planning algorithms and has high practical value.
The above examples only illustrate the technical idea of the present invention and do not limit its scope of protection; any change made on the basis of the technical scheme according to the technical idea provided by the invention falls within the scope of protection of the present invention.
Claims (5)
1. A UAV obstacle avoidance method based on the fusion of binocular vision and optical flow, characterized by comprising the following steps:
Step 1: acquire images in the UAV's direction of advance with a binocular camera and convert them to grayscale;
Step 2: compute the feature information of each pixel in the grayscale images and perform stereo matching to obtain a depth map of the UAV's direction of advance;
Step 3: divide the depth values in the depth map into two classes, obstacle and background, thereby splitting the map into an obstacle region and a background region; take the largest closed contour in the obstacle region as the obstacle profile and fit a rectangle to it, yielding an obstacle tracking window as the obstacle's geometric information;
Step 4: compute the velocity vector of the obstacle tracking window with a dense optical-flow method, obtaining the window's speed in the x and y directions; predict the window's position in the next frame from this velocity and compare the prediction with the window actually computed in the next frame; if the difference is below a threshold, proceed to Step 5, otherwise return to Step 1 and recompute;
Step 5: compute the obstacle's threat depth value for the UAV using the threat-depth model; the detailed process is: let D_K = {d₁, d₂, …, d_K} be the set of depth values belonging to the obstacle, where d₁, …, d_K are depth values and K is their number; let D₁, …, D_t be the discrete credible depth layers, t the number of layers, and K₁, …, K_t the number of depth values in each layer; if at least one K_j reaches a set count threshold, the obstacle's threat depth value for the UAV is determined from D_min, the nearest depth layer among those whose count exceeds the threshold, with K_min the number of depth values in D_min; if all of K₁, …, K_t fall below the threshold, the threat depth value is instead taken over the depth layers as a whole;
Step 6: convert the obstacle geometric information from Step 3 and the obstacle velocity information from Step 4 from pixel coordinates to world coordinates, and correct them with the UAV's motion parameters;
Step 7: send the obstacle position information together with the geometric and velocity information from Step 6 to the UAV's flight control computer, which uses this information to command real-time evasive action.
2. The UAV obstacle avoidance method based on the fusion of binocular vision and optical flow according to claim 1, characterized in that the detailed process of acquiring images in the UAV's direction of advance with the binocular camera in Step 1 is: mount the binocular camera at the nose of the UAV; obtain the intrinsic and extrinsic parameter matrices and distortion parameters of the binocular camera by calibration; acquire images in the direction of advance with the binocular camera; and rectify the images with the intrinsic/extrinsic parameter matrices and distortion parameters to obtain two undistorted, row-aligned images.
3. The UAV obstacle avoidance method based on the fusion of binocular vision and optical flow according to claim 1, characterized in that the detailed process of Step 2 is: for each pixel of the grayscale images, compute and accumulate the energy function along 8 directions (up, down, left, right, upper-left, lower-left, upper-right, lower-right); find the disparity value that minimizes the accumulated energy function; and determine each pixel's depth value from its disparity.
4. The UAV obstacle avoidance method based on the fusion of binocular vision and optical flow according to claim 1, characterized in that, before taking the largest closed contour in the obstacle region as the obstacle profile in Step 3, the depth map in which obstacle and background regions have been distinguished is filtered with a speckle filter to remove noise.
5. The UAV obstacle avoidance method based on the fusion of binocular vision and optical flow according to claim 1, characterized in that the detailed process of dividing the depth values into obstacle and background classes in Step 3 is:
set a segmentation threshold D_h; classify depth values greater than or equal to D_h as obstacle and depth values less than D_h as background; find D_h by maximizing the between-class variance σ² of obstacle and background:
σ² = ω₀ · ω₁ · (μ₀ − μ₁)²
where ω₀ and ω₁ are the probabilities that a depth value is assigned by D_h to the obstacle class and the background class respectively, and μ₀ and μ₁ are the mean depth values of the obstacle and background classes:
μ₀ = Σ_{j=1..h} D_j·K_j / Σ_{j=1..h} K_j,  μ₁ = Σ_{j=h+1..t} D_j·K_j / Σ_{j=h+1..t} K_j
where D_i (i = 1, …, t) are the discrete credible depth layers, t is the number of depth layers, D₁, …, D_h are the layers belonging to the obstacle, D_{h+1}, …, D_t are the layers belonging to the background, and K_j (j = 1, …, t) is the number of depth values in each layer.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201611069481.7A CN106681353B (en) | 2016-11-29 | 2016-11-29 | The unmanned plane barrier-avoiding method and system merged based on binocular vision with light stream |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201611069481.7A CN106681353B (en) | 2016-11-29 | 2016-11-29 | The unmanned plane barrier-avoiding method and system merged based on binocular vision with light stream |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106681353A CN106681353A (en) | 2017-05-17 |
CN106681353B true CN106681353B (en) | 2019-10-25 |
Family
ID=58866816
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201611069481.7A Expired - Fee Related CN106681353B (en) | 2016-11-29 | 2016-11-29 | The unmanned plane barrier-avoiding method and system merged based on binocular vision with light stream |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106681353B (en) |
Families Citing this family (49)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107091643A (en) * | 2017-06-07 | 2017-08-25 | 旗瀚科技有限公司 | A kind of indoor navigation method based on many 3D structure lights camera splicings |
CN109214984B (en) * | 2017-07-03 | 2023-03-14 | 臻迪科技股份有限公司 | Image acquisition method and device, autonomous positioning navigation system and computing equipment |
CN107689063A (en) * | 2017-07-27 | 2018-02-13 | 南京理工大学北方研究院 | A kind of robot indoor orientation method based on ceiling image |
CN107388967B (en) * | 2017-08-14 | 2019-11-12 | 上海汽车集团股份有限公司 | A kind of outer parameter compensation method of vehicle-mounted three-dimensional laser sensor and device |
CN108007474A (en) * | 2017-08-31 | 2018-05-08 | 哈尔滨工业大学 | A kind of unmanned vehicle independent positioning and pose alignment technique based on land marking |
CN108106623B (en) * | 2017-09-08 | 2021-06-04 | 同济大学 | Unmanned vehicle path planning method based on flow field |
CN107497621B (en) * | 2017-09-20 | 2018-04-06 | 安徽灵感科技有限公司 | Extended pattern is atomized regulating system and method |
CN107908195B (en) * | 2017-11-06 | 2021-09-21 | 深圳市道通智能航空技术股份有限公司 | Target tracking method, target tracking device, tracker and computer-readable storage medium |
CN107909614B (en) * | 2017-11-13 | 2021-02-26 | 中国矿业大学 | Positioning method of inspection robot in GPS failure environment |
CN108058838A (en) * | 2017-12-03 | 2018-05-22 | 中国直升机设计研究所 | A kind of helicopter collision avoidance system based on binocular distance measurement |
CN108053691A (en) * | 2017-12-19 | 2018-05-18 | 广东省航空航天装备技术研究所 | A kind of unmanned plane of unmanned plane anticollision automatic testing method and application this method |
CN108280401B (en) * | 2017-12-27 | 2020-04-07 | 达闼科技(北京)有限公司 | Pavement detection method and device, cloud server and computer program product |
CN108230403A (en) * | 2018-01-23 | 2018-06-29 | 北京易智能科技有限公司 | A kind of obstacle detection method based on space segmentation |
CN108082506A (en) * | 2018-01-26 | 2018-05-29 | 锐合防务技术(北京)有限公司 | Unmanned vehicle |
CN108536171B (en) * | 2018-03-21 | 2020-12-29 | 电子科技大学 | Path planning method for collaborative tracking of multiple unmanned aerial vehicles under multiple constraints |
CN108445905A (en) * | 2018-03-30 | 2018-08-24 | 合肥赛为智能有限公司 | A kind of UAV Intelligent avoidance regulator control system |
CN110488805A (en) * | 2018-05-15 | 2019-11-22 | 武汉小狮科技有限公司 | A kind of unmanned vehicle obstacle avoidance system and method based on 3D stereoscopic vision |
CN108873931A (en) * | 2018-06-05 | 2018-11-23 | 北京理工雷科电子信息技术有限公司 | A kind of unmanned plane vision avoiding collision combined based on subjectiveness and objectiveness |
CN109520497B (en) * | 2018-10-19 | 2022-09-30 | 天津大学 | Unmanned aerial vehicle autonomous positioning method based on vision and imu |
CN111113404B (en) * | 2018-11-01 | 2023-07-04 | 阿里巴巴集团控股有限公司 | Method for robot to obtain position service and robot |
CN112912811B (en) * | 2018-11-21 | 2024-03-29 | 深圳市道通智能航空技术股份有限公司 | Unmanned aerial vehicle path planning method and device and unmanned aerial vehicle |
CN109753081B (en) * | 2018-12-14 | 2020-08-21 | 煤炭科学研究总院 | Roadway inspection unmanned aerial vehicle system based on machine vision and navigation method |
CN111354027A (en) * | 2018-12-21 | 2020-06-30 | 沈阳新松机器人自动化股份有限公司 | Visual obstacle avoidance method for mobile robot |
CN109947093A (en) * | 2019-01-24 | 2019-06-28 | 广东工业大学 | A kind of intelligent barrier avoiding algorithm based on binocular vision |
CN110007313A (en) * | 2019-03-08 | 2019-07-12 | 中国科学院深圳先进技术研究院 | Obstacle detection method and device based on unmanned plane |
WO2020215194A1 (en) * | 2019-04-23 | 2020-10-29 | 深圳市大疆创新科技有限公司 | Method and system for detecting moving target object, and movable platform |
CN110222581B (en) * | 2019-05-13 | 2022-04-19 | 电子科技大学 | Binocular camera-based quad-rotor unmanned aerial vehicle visual target tracking method |
CN110209184A (en) * | 2019-06-21 | 2019-09-06 | 太原理工大学 | A kind of unmanned plane barrier-avoiding method based on binocular vision system |
CN110299030B (en) * | 2019-06-28 | 2021-11-19 | 汉王科技股份有限公司 | Handheld terminal, aircraft, airspace measurement method and control method of aircraft |
CN110543186A (en) * | 2019-08-02 | 2019-12-06 | 佛山科学技术学院 | forest fire monitoring system and method based on unmanned aerial vehicle and storage medium |
CN110647156B (en) * | 2019-09-17 | 2021-05-11 | 中国科学院自动化研究所 | Target object docking ring-based docking equipment pose adjusting method and system |
CN110568861B (en) * | 2019-09-19 | 2022-09-16 | 中国电子科技集团公司电子科学研究院 | Man-machine movement obstacle monitoring method, readable storage medium and unmanned machine |
CN110673647B (en) * | 2019-11-07 | 2022-05-03 | 深圳市道通智能航空技术股份有限公司 | Omnidirectional obstacle avoidance method and unmanned aerial vehicle |
CN111428651B (en) * | 2020-03-26 | 2023-05-16 | 广州小鹏汽车科技有限公司 | Obstacle information acquisition method and system for vehicle and vehicle |
CN111619556B (en) * | 2020-05-22 | 2022-05-03 | 奇瑞汽车股份有限公司 | Obstacle avoidance control method and device for automobile and storage medium |
CN111736631B (en) * | 2020-07-09 | 2023-03-21 | 纪关荣 | Path planning method and system of pesticide spraying robot |
CN111950502B (en) * | 2020-08-21 | 2024-04-16 | 东软睿驰汽车技术(沈阳)有限公司 | Obstacle object-based detection method and device and computer equipment |
CN112180943B (en) * | 2020-10-19 | 2022-07-01 | 山东交通学院 | Underwater robot navigation obstacle avoidance method based on visual image and laser radar |
CN112148033A (en) * | 2020-10-22 | 2020-12-29 | 广州极飞科技有限公司 | Method, device and equipment for determining unmanned aerial vehicle air route and storage medium |
CN114627398A (en) * | 2020-12-10 | 2022-06-14 | 中国科学院深圳先进技术研究院 | Unmanned aerial vehicle positioning method and system based on screen optical communication |
CN112906479B (en) * | 2021-01-22 | 2024-01-26 | 成都纵横自动化技术股份有限公司 | Unmanned aerial vehicle auxiliary landing method and system thereof |
CN112907629A (en) * | 2021-02-08 | 2021-06-04 | 浙江商汤科技开发有限公司 | Image feature tracking method and device, computer equipment and storage medium |
CN113031648A (en) * | 2021-02-26 | 2021-06-25 | 华南理工大学 | Method for avoiding obstacles of rotor unmanned aerial vehicle based on sensory depth camera |
CN114046796A (en) * | 2021-11-04 | 2022-02-15 | 南京理工大学 | Intelligent wheelchair autonomous walking algorithm, device and medium |
CN114905512A (en) * | 2022-05-16 | 2022-08-16 | 安徽元古纪智能科技有限公司 | Panoramic tracking and obstacle avoidance method and system for intelligent inspection robot |
CN114879729A (en) * | 2022-05-16 | 2022-08-09 | 西北工业大学 | Unmanned aerial vehicle autonomous obstacle avoidance method based on obstacle contour detection algorithm |
CN115576357B (en) * | 2022-12-01 | 2023-07-07 | 浙江大有实业有限公司杭州科技发展分公司 | Full-automatic unmanned aerial vehicle inspection intelligent path planning method under RTK signal-free scene |
CN116820132B (en) * | 2023-07-06 | 2024-01-09 | 杭州牧星科技有限公司 | Flight obstacle avoidance early warning prompting method and system based on remote vision sensor |
CN117170411B (en) * | 2023-11-02 | 2024-02-02 | 山东环维游乐设备有限公司 | Vision assistance-based auxiliary obstacle avoidance method for racing unmanned aerial vehicle |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103196443A (en) * | 2013-04-09 | 2013-07-10 | 王宁羽 | Flight body posture measuring method and system based on light stream and additional information |
CN103365297A (en) * | 2013-06-29 | 2013-10-23 | 天津大学 | Optical flow-based four-rotor unmanned aerial vehicle flight control method |
CN105222760A (en) * | 2015-10-22 | 2016-01-06 | 一飞智控(天津)科技有限公司 | The autonomous obstacle detection system of a kind of unmanned plane based on binocular vision and method |
CN105787447A (en) * | 2016-02-26 | 2016-07-20 | 深圳市道通智能航空技术有限公司 | Method and system of unmanned plane omnibearing obstacle avoidance based on binocular vision |
CN105959627A (en) * | 2016-05-11 | 2016-09-21 | 徐洪恩 | Automatic wireless charging type artificial intelligence unmanned aerial vehicle |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9493235B2 (en) * | 2002-10-01 | 2016-11-15 | Dylan T X Zhou | Amphibious vertical takeoff and landing unmanned device |
US9361706B2 (en) * | 2009-11-30 | 2016-06-07 | Brigham Young University | Real-time optical flow sensor design and its application to obstacle detection |
Also Published As
Publication number | Publication date |
---|---|
CN106681353A (en) | 2017-05-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106681353B (en) | The unmanned plane barrier-avoiding method and system merged based on binocular vision with light stream | |
Häne et al. | Obstacle detection for self-driving cars using only monocular cameras and wheel odometry | |
US8233660B2 (en) | System and method for object motion detection based on multiple 3D warping and vehicle equipped with such system | |
CN104482934B (en) | The super close distance autonomous navigation device of a kind of Multi-sensor Fusion and method | |
CN111553252B (en) | Road pedestrian automatic identification and positioning method based on deep learning and U-V parallax algorithm | |
EP2948927B1 (en) | A method of detecting structural parts of a scene | |
WO2018086133A1 (en) | Methods and systems for selective sensor fusion | |
CN109472831A (en) | Obstacle recognition range-measurement system and method towards road roller work progress | |
CN109034018A (en) | A kind of low latitude small drone method for barrier perception based on binocular vision | |
CN105225482A (en) | Based on vehicle detecting system and the method for binocular stereo vision | |
CN107677274B (en) | Unmanned plane independent landing navigation information real-time resolving method based on binocular vision | |
CN104215239A (en) | Vision-based autonomous unmanned plane landing guidance device and method | |
CN107688184A (en) | A kind of localization method and system | |
CN110926474A (en) | Satellite/vision/laser combined urban canyon environment UAV positioning and navigation method | |
Hu et al. | Obstacle avoidance methods for rotor UAVs using RealSense camera | |
CN104331901A (en) | TLD-based multi-view target tracking device and method | |
CN113850102B (en) | Vehicle-mounted vision detection method and system based on millimeter wave radar assistance | |
CN114325634A (en) | Method for extracting passable area in high-robustness field environment based on laser radar | |
KR20210034253A (en) | Method and device to estimate location | |
CN112945233B (en) | Global drift-free autonomous robot simultaneous positioning and map construction method | |
Majdik et al. | Micro air vehicle localization and position tracking from textured 3d cadastral models | |
Pfeiffer et al. | Ground truth evaluation of the Stixel representation using laser scanners | |
CN111197986A (en) | Real-time early warning and obstacle avoidance method for three-dimensional path of unmanned aerial vehicle | |
Zheng et al. | Integrated navigation system with monocular vision and LIDAR for indoor UAVs | |
Franke et al. | Towards optimal stereo analysis of image sequences |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |
CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 20191025; Termination date: 20211129 |