CN111445491A - Three-neighborhood maximum difference value edge detection narrow lane guidance algorithm for micro unmanned aerial vehicle - Google Patents


Info

Publication number
CN111445491A
Authority
CN
China
Prior art keywords
image
unmanned aerial vehicle
algorithm
guidance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010212200.9A
Other languages
Chinese (zh)
Other versions
CN111445491B (en)
Inventor
徐盼盼
王冠林
史海庆
李德辉
唐宁
王宜东
Current Assignee
Shandong Zhiyi Aviation Technology Co ltd
Original Assignee
Shandong Zhiyi Aviation Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shandong Zhiyi Aviation Technology Co ltd
Priority to CN202010212200.9A
Publication of CN111445491A
Application granted
Publication of CN111445491B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/10 - Segmentation; Edge detection
    • G06T7/13 - Edge detection
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 - Instruments for performing navigational calculations
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 - Instruments for performing navigational calculations
    • G01C21/206 - Instruments for performing navigational calculations specially adapted for indoor navigation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/20 - Special algorithmic details
    • G06T2207/20024 - Filtering details
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/20 - Special algorithmic details
    • G06T2207/20048 - Transform domain processing
    • G06T2207/20061 - Hough transform

Abstract

The invention provides a three-neighborhood maximum difference edge detection narrow lane guidance algorithm for a micro unmanned aerial vehicle, belonging to the field of unmanned aerial vehicle intelligent vision, and comprising three stages: image acquisition, narrow passage detection, and guidance calculation. In the image acquisition stage, a correct image is detected and extracted. In the narrow passage detection stage, edge information of the image is obtained through an improved three-neighborhood maximum difference edge detection algorithm, which takes the maximum pixel difference in three specific directions. In the guidance calculation stage, the real narrow passage sidelines are extracted with a filtering algorithm, and the guidance information is obtained by calculating the angle and position relation between the angle bisector of the sidelines and the image sight lines. The invention realizes fully autonomous guidance of the unmanned aerial vehicle in a narrow passage with no prior map; compared with the traditional multi-camera, multi-sensor scheme, it ensures detection precision and response time while simplifying the sensor suite and prolonging the endurance time of the unmanned aerial vehicle, and can be effectively used for indoor flight guidance of micro unmanned aerial vehicles.

Description

Three-neighborhood maximum difference value edge detection narrow lane guidance algorithm for micro unmanned aerial vehicle
Technical Field
The invention relates to the technical field of micro unmanned aerial vehicles, in particular to a three-neighborhood maximum difference edge detection narrow lane guidance algorithm for a micro unmanned aerial vehicle, which is mainly applied to the technical fields of micro unmanned aerial vehicles, robots, automatic driving automobiles and the like.
Background
Existing guidance of micro unmanned aerial vehicles in narrow passages is usually achieved through a SLAM algorithm or sensor ranging. The former is computationally complex and power-hungry; the latter adds payload to the vehicle and offers low precision. Moreover, the extra power consumption or payload shortens the endurance time of the unmanned aerial vehicle, so neither approach is well suited to micro unmanned aerial vehicles.
Disclosure of Invention
The technical problem to be solved by the invention is as follows: to overcome the defects of the prior art, the invention provides a three-neighborhood maximum difference edge detection narrow lane guidance algorithm for a micro unmanned aerial vehicle, a visual guidance algorithm for autonomous flight in a narrow passage with no prior map, which obtains accurate, real-time guidance data to guide the micro unmanned aerial vehicle autonomously through the passage.
The technical scheme adopted for solving the technical problems is as follows: a three-neighborhood maximum difference value edge detection lane guidance algorithm for a micro unmanned aerial vehicle comprises three stages of image acquisition, lane detection and guidance calculation; in the image acquisition stage, detecting and extracting a correct image; in the narrow passage detection stage, edge information of an image is obtained through an improved three-neighborhood difference maximum edge detection algorithm, and pixel difference maximum values in three specific directions are obtained; and in the guidance calculation stage, a median average filtering algorithm is used for extracting a real narrow-path side line, and the guidance information is obtained by calculating the angle and position relation between the angle bisector of the narrow-path side line and the image aiming line.
And in the image acquisition stage, in the process that the micro unmanned aerial vehicle flies in the throat, the airborne forward-looking camera acquires images in real time and judges the correctness of the image data source according to whether the image data is empty or not.
In the stage of detecting the narrow passage, the edge in the image is detected by using an improved three-neighborhood difference maximum algorithm:
f(x,y)=max(|f(x,y)-f(x+1,y)|,|f(x,y)-f(x,y+1)|,|f(x,y)-f(x+1,y+1)|) (1)
wherein f (x, y) represents the gray value of the image (x, y) point, and the maximum value of the absolute value of the difference between the three adjacent points f (x +1, y), f (x, y +1), f (x +1, y +1) of f (x, y) is calculated as the pixel value of f (x, y) point;
according to the selected threshold θ:
f(x,y) = 255, if f(x,y) > θ; f(x,y) = 0, if f(x,y) ≤ θ (2)
and obtaining an edge detection binary image of the three-neighborhood difference maximum value edge detection algorithm.
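As an illustration (not the patent's own code), the three-neighborhood maximum difference operator of formula (1) and the subsequent threshold binarisation can be sketched in Python; the 0/255 output levels and the default threshold value are assumptions, since the patent only specifies a binary image:

```python
import numpy as np

def three_neighborhood_edges(img, theta=30):
    """Edge map via the three-neighborhood maximum-difference rule: for each
    pixel, take the largest absolute grey-level difference against its right,
    lower and lower-right neighbours, then binarise with threshold theta."""
    g = img.astype(np.int32)                     # avoid uint8 wrap-around
    d_right = np.abs(g[:-1, :-1] - g[:-1, 1:])   # |f(x,y) - f(x+1,y)|
    d_down  = np.abs(g[:-1, :-1] - g[1:, :-1])   # |f(x,y) - f(x,y+1)|
    d_diag  = np.abs(g[:-1, :-1] - g[1:, 1:])    # |f(x,y) - f(x+1,y+1)|
    diff = np.maximum(np.maximum(d_right, d_down), d_diag)
    return np.where(diff > theta, 255, 0).astype(np.uint8)
```

Because only rightward, downward and diagonal differences are taken, the output is one row and one column smaller than the input.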
All straight lines in the image are detected by a classical Hough straight line detection algorithm, and a part of non-target straight lines are filtered out by setting parameters of the Hough algorithm.
The flow of the classical Hough algorithm is roughly as follows: given an object and the kind of shape to be identified, the algorithm votes in the parameter space, and the shape of the object is determined by the local maxima in the accumulation space.
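For concreteness, this voting scheme might be sketched as a minimal numpy Hough transform; the angular resolution and vote threshold below are illustrative assumptions, and production code would typically use OpenCV's HoughLines instead:

```python
import numpy as np

def hough_lines(edges, n_theta=180, vote_thresh=20):
    """Minimal classical Hough line transform: every edge pixel votes for all
    (rho, theta) lines passing through it; accumulator cells whose vote count
    reaches vote_thresh are returned as detected lines."""
    h, w = edges.shape
    diag = int(np.ceil(np.hypot(h, w)))               # largest possible |rho|
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    acc = np.zeros((2 * diag + 1, n_theta), dtype=np.int32)
    ys, xs = np.nonzero(edges)
    for x, y in zip(xs, ys):
        rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
        acc[rhos + diag, np.arange(n_theta)] += 1     # shift rho to a valid index
    peaks = np.argwhere(acc >= vote_thresh)
    return [(int(r) - diag, float(thetas[t])) for r, t in peaks]
```

Raising `vote_thresh` is the kind of parameter setting the text mentions for filtering out short, non-target lines.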
By computing image differences in three self-defined directions and taking the maximum as the edge detection result at the current position, the three-neighborhood maximum difference algorithm detects only the edge features in the required directions; the maximum operation enhances the edge features of the image and effectively counteracts the image blurring produced during micro unmanned aerial vehicle flight.
And in the guidance calculation stage, distinguishing a left narrow lane side line and a right narrow lane side line according to whether the slope of the result of Hough linear detection is greater than 0, forming two groups of linear clusters, and obtaining two groups of linear equation parameter arrays:
{a_0, a_1, ..., a_n}, {b_0, b_1, ..., b_n} (3)
and respectively solving a real narrow path sideline for the two groups of parameters by using a median average filtering algorithm, wherein the calculation steps are as follows:
and respectively averaging each group of parameters:
a_mean = (a_0 + a_1 + ... + a_n)/n (4)
b_mean = (b_0 + b_1 + ... + b_n)/n (5)
eliminating the discrete lines whose deviation from the mean is too large (ε_a and ε_b being the chosen deviation thresholds):
a_i = 0, if |a_i - a_mean| > ε_a (6)
b_i = 0, if |b_i - b_mean| > ε_b (7)
the weighted average of the remaining lines is then found:
a_sum = Σ_{i=0}^{n} w_i·a_i (8)
b_sum = Σ_{i=0}^{n} w_i·b_i (9)
a_mean = a_sum/m_a (10)
b_mean = b_sum/m_b (11)
wherein w_i is a weighting factor, m_a is the number of parameters greater than 0 in the a_i array, and m_b is the number of parameters greater than 0 in the b_i array. Two narrow-passage sidelines are obtained through the median average filtering algorithm, and the angle bisector L of the included angle between the two sidelines is then found.
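The filtering steps above might be sketched as follows for one parameter group; equal weights w_i = 1 are an assumption, since the patent defines w_i only as a weighting factor:

```python
import numpy as np

def median_average_filter(params, eps):
    """Sketch of the 'median average filtering': average a parameter group,
    discard entries deviating from the mean by more than eps, then average
    the survivors (with equal weights as an assumed w_i = 1)."""
    p = np.asarray(params, dtype=float)
    mean = p.mean()                       # formulas (4)/(5)
    kept = p[np.abs(p - mean) <= eps]     # formulas (6)/(7): drop outliers
    return kept.sum() / len(kept)         # formulas (8)-(11) with w_i = 1
```

Applied once to the left-cluster parameters and once to the right-cluster parameters, this yields the two representative sidelines.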
The included angle α between the bisector L of the throat sideline angle and the vertical sight line of the image is calculated and used as the yaw angle α for heading guidance.
The intersection point of the bisector L with the horizontal image sight line L_W is marked as A, and the intersection point of the vertical sight line L_H with the horizontal sight line L_W is marked as B; a vector AB is established, and its modulus |AB| is used as the yaw distance for lateral deviation correction.
The yaw angle α and yaw distance |AB| are sent to the unmanned aerial vehicle flight controller to guide the vehicle through the throat; the algorithm then returns to the first stage to begin a new calculation cycle.
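As an illustration of this geometry (not the patent's implementation), the yaw angle and lateral deviation can be computed from two sideline equations y = a·x + b; building the bisector from averaged unit direction vectors is an assumed construction:

```python
import math

def guidance(left, right, img_w, img_h):
    """Yaw angle alpha and signed lateral deviation from two sidelines given
    as (slope, intercept) pairs in image coordinates. Assumes the sidelines
    are not parallel."""
    def unit_dir(a):
        # Unit direction of y = a*x + b, oriented toward increasing y.
        n = math.hypot(1.0, a)
        dx, dy = 1.0 / n, a / n
        return (dx, dy) if dy >= 0 else (-dx, -dy)

    (a1, b1), (a2, b2) = left, right
    # Apex of the corridor: intersection of the two sidelines.
    x0 = (b2 - b1) / (a1 - a2)
    y0 = a1 * x0 + b1
    dx1, dy1 = unit_dir(a1)
    dx2, dy2 = unit_dir(a2)
    bx, by = dx1 + dx2, dy1 + dy2        # sum of unit vectors bisects the angle
    alpha = math.atan2(bx, by)           # angle to the vertical sight line L_H
    # A: bisector meets the horizontal sight line y = img_h/2; B: image centre.
    t = (img_h / 2.0 - y0) / by
    ax = x0 + t * bx
    return alpha, ax - img_w / 2.0       # (yaw angle, signed |AB| deviation)
```

For a symmetric corridor centred in the image, both outputs are zero, i.e. no correction is needed.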
The invention has the following beneficial effects. It provides an image guidance algorithm that supplies guidance information to a micro unmanned aerial vehicle in an unknown narrow-passage environment. Using a single forward-looking camera, the algorithm detects the narrow passage from image data alone, computes the vehicle's offset from the passage centre to hold it at the midpoint, and acquires the nose heading to keep the flight direction parallel to the passage centreline. The improved three-neighborhood maximum difference algorithm obtains the image edge information; compared with conventional algorithms, it screens out edge information in the required directions by selecting specific neighborhood differences, and the maximum of the differences is chosen to enhance the edge features of the image, which markedly mitigates the image blur produced during micro unmanned aerial vehicle flight. Two narrow-passage sidelines are extracted from the Hough line detection result by a median average filtering algorithm, deleting lines whose parameters deviate too far from the mean and combining the rest by weighted addition; the yaw angle and lateral deviation are obtained by analysing the angle and intersection-point relations among the sideline angle bisector, the vertical image sight line and the horizontal image sight line, guiding the micro unmanned aerial vehicle to fly along the middle of the narrow passage with the nose parallel to the centreline and the airframe at the midpoint. The invention reduces hardware weight, shortens the response time of the algorithm, and improves its detection precision.
Drawings
The invention is further illustrated by the following figures and examples.
Fig. 1 is a flow chart of the algorithm of the present invention.
Fig. 2(a)-2(d) form the image processing flow: fig. 2(a) is the original narrow-passage image, fig. 2(b) the three-neighborhood maximum difference result, fig. 2(c) the Hough line detection result, and fig. 2(d) the final result after median average filtering of the Hough result, annotated with the guidance information.
Detailed Description
The present invention will now be described in detail with reference to the accompanying drawings. This figure is a simplified schematic diagram, and merely illustrates the basic structure of the present invention in a schematic manner, and therefore it shows only the constitution related to the present invention.
As shown in fig. 1, the invention relates to a three-neighborhood maximum difference edge detection throat guidance algorithm for a micro unmanned aerial vehicle, in particular to a throat flight visual detection and guidance algorithm for a micro unmanned aerial vehicle, and the whole algorithm is realized in three stages:
the first stage, image acquisition, requires two implementation steps, in which,
step 1.1: in the process of flying the micro unmanned aerial vehicle in the throat, the airborne forward-looking camera acquires images in real time, then whether the images are empty is detected, and the step 1 is restarted if the images are empty;
step 1.2: and detecting whether the acquired image data is non-empty, and if the data is non-empty, determining that the correct image data is acquired, and transferring the detected image to the second stage.
The second stage, throat detection, is divided into two implementation steps, wherein,
step 2.1: and detecting edges in the image by using a modified three-neighborhood difference maximum algorithm.
The algorithm obtains the edge detection map by taking, for each pixel, the maximum of the absolute differences between its value and those of its three adjacent pixels as the edge value of that point;
f(x,y)=max(|f(x,y)-f(x+1,y)|,|f(x,y)-f(x,y+1)|,|f(x,y)-f(x+1,y+1)|) (1)
wherein f (x, y) represents the gray value of the image (x, y) point, and the maximum value of the absolute value of the difference between the three adjacent points f (x +1, y), f (x, y +1), f (x +1, y +1) of f (x, y) is calculated as the pixel value of f (x, y) point;
according to the selected threshold θ:
f(x,y) = 255, if f(x,y) > θ; f(x,y) = 0, if f(x,y) ≤ θ (2)
and obtaining an edge detection binary image of the three-neighborhood difference maximum algorithm.
Step 2.2: all lines in the image are detected using the classical Hough line detection algorithm. By setting parameters of the Hough algorithm, a part of non-target straight lines can be filtered.
In the third stage, guidance calculation, two pieces of guidance information must be obtained, namely the yaw angle and the lateral deviation, which requires three steps:
step 3.1: and distinguishing a left narrow lane sideline and a right narrow lane sideline from all the straight lines obtained in the second stage according to whether the slope is greater than or less than a threshold value, forming two groups of straight line clusters, and obtaining two groups of straight line equation parameter arrays:
{a_0, a_1, ..., a_n}, {b_0, b_1, ..., b_n} (3)
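The left/right split feeding these parameter arrays can be sketched as follows, assuming each detected line is reduced to a (slope, intercept) pair and using the threshold 0 given in the claims:

```python
def split_edge_lines(lines, slope_thresh=0.0):
    """Step 3.1 sketch: split Hough-detected lines into left and right
    sideline clusters by comparing each slope against the threshold."""
    left  = [(a, b) for a, b in lines if a > slope_thresh]
    right = [(a, b) for a, b in lines if a <= slope_thresh]
    return left, right
```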
and respectively averaging each group of parameters:
a_mean = (a_0 + a_1 + ... + a_n)/n (4)
b_mean = (b_0 + b_1 + ... + b_n)/n (5)
eliminating the discrete lines whose deviation from the mean is too large (ε_a and ε_b being the chosen deviation thresholds):
a_i = 0, if |a_i - a_mean| > ε_a (6)
b_i = 0, if |b_i - b_mean| > ε_b (7)
the weighted average of the remaining lines is then found:
a_sum = Σ_{i=0}^{n} w_i·a_i (8)
b_sum = Σ_{i=0}^{n} w_i·b_i (9)
a_mean = a_sum/m_a (10)
b_mean = b_sum/m_b (11)
wherein w_i is a weighting factor, m_a is the number of parameters greater than 0 in the a_i array, and m_b is the number of parameters greater than 0 in the b_i array; two narrow-path sidelines are thus obtained, and accordingly the angle bisector L of their included angle is obtained.
Step 3.2: calculate the included angle α between the bisector L of the narrow-passage sideline included angle and the vertical sight line of the image, and use it as the yaw angle α for heading guidance.
Step 3.3: the intersection point of the bisector L with the horizontal image sight line L_W is marked as A, and the intersection point of the vertical sight line L_H with the horizontal sight line L_W is marked as B; a vector AB is established, and its modulus |AB| is used as the yaw distance for lateral deviation correction.
The yaw angle α and yaw distance |AB| are sent to the unmanned aerial vehicle flight controller to guide the vehicle through the throat; this completes a single calculation period, and the algorithm returns to the first stage to begin a new one.
As shown in fig. 2(a), the original image shows a corridor 2.4 m wide and 15 m long; at a flight speed of 2 m/s, the three-neighborhood maximum difference edge detection guidance algorithm resolves one frame of data in 40 ms, i.e. 25 frames/s, which meets the flight requirement.
Fig. 2(b) is a graph of the edge detection result after processing by the three-neighborhood difference maximum algorithm, and the edge features in the graph are well detected.
Fig. 2(c) is the line detection result of running the Hough algorithm on the edge detection map; through the parameter settings of the Hough algorithm, only the longer and more point-continuous lines in the image are retained.
Fig. 2(d) shows the final result after median average filtering of the Hough line detection result. From it, the included angle α between the bisector L of the sideline included angle and the vertical image sight line is calculated and used as the yaw angle α for heading guidance; the intersection point of the bisector L with the horizontal sight line L_W is marked as A, and the intersection point of the vertical sight line L_H with the horizontal sight line L_W is marked as B; the vector AB is established, and its modulus |AB| is used as the yaw distance for lateral deviation correction. The yaw angle α and yaw distance |AB| are sent to the unmanned aerial vehicle flight controller to guide the vehicle through the narrow passage.
In light of the foregoing description of preferred embodiments in accordance with the invention, it is to be understood that numerous changes and modifications may be made by those skilled in the art without departing from the scope of the invention. The technical scope of the present invention is not limited to the contents of the specification, and must be determined according to the scope of the claims.

Claims (4)

1. A three-neighborhood maximum difference edge detection narrow lane guidance algorithm for a micro unmanned aerial vehicle, characterized in that: the method comprises three stages of image acquisition, throat detection and guidance calculation; in the image acquisition stage, a correct image is detected and extracted; in the narrow passage detection stage, edge information of the image is obtained through an improved three-neighborhood maximum difference edge detection algorithm, taking the maximum pixel difference in three specific directions; and in the guidance calculation stage, a median average filtering algorithm is used to extract the real narrow-path sidelines, and the guidance information is obtained by calculating the angle and position relation between the angle bisector of the narrow-path sidelines and the image sight lines.
2. The unmanned aerial vehicle three-neighborhood maximum difference edge detection throat guidance algorithm of claim 1, wherein: the image acquisition stage specifically comprises the following steps:
in the process of flying the micro unmanned aerial vehicle in the narrow passage, the airborne forward-looking camera acquires images in real time and judges the correctness of the image data source.
3. The unmanned aerial vehicle three-neighborhood maximum difference edge detection throat guidance algorithm of claim 2, wherein: the throat detection stage specifically comprises the following steps:
detecting edges in the image with an improved three-neighborhood difference maximum algorithm:
f(x,y)=max(|f(x,y)-f(x+1,y)|,|f(x,y)-f(x,y+1)|,|f(x,y)-f(x+1,y+1)|) (1)
wherein f (x, y) represents the gray value of the image (x, y) point, and the maximum value of the absolute value of the difference between the three adjacent points f (x +1, y), f (x, y +1), f (x +1, y +1) of f (x, y) is calculated as the pixel value of f (x, y) point;
according to the selected threshold θ:
f(x,y) = 255, if f(x,y) > θ; f(x,y) = 0, if f(x,y) ≤ θ (2)
obtaining an edge detection binary image of a three-neighborhood difference maximum edge detection algorithm;
all straight lines in the image are detected by a classical Hough straight line detection algorithm, and a part of non-target straight lines are filtered out by setting parameters of the Hough algorithm.
4. The unmanned aerial vehicle three-neighborhood maximum difference edge detection throat guidance algorithm of claim 3, wherein: the guidance calculation stage specifically comprises the following steps:
firstly, distinguishing a left narrow-path side line and a right narrow-path side line according to whether the slope of the Hough linear detection result is greater than 0, forming two groups of linear clusters, and obtaining two groups of linear equation parameter arrays:
{a_0, a_1, ..., a_n}, {b_0, b_1, ..., b_n} (3)
and respectively solving a real narrow path sideline for the two groups of parameters by using a median average filtering algorithm, wherein the calculation steps are as follows:
and respectively averaging each group of parameters:
a_mean = (a_0 + a_1 + ... + a_n)/n (4)
b_mean = (b_0 + b_1 + ... + b_n)/n (5)
eliminating the discrete lines whose deviation from the mean is too large (ε_a and ε_b being the chosen deviation thresholds):
a_i = 0, if |a_i - a_mean| > ε_a (6)
b_i = 0, if |b_i - b_mean| > ε_b (7)
then finding the weighted average of the remaining lines:
a_sum = Σ_{i=0}^{n} w_i·a_i (8)
b_sum = Σ_{i=0}^{n} w_i·b_i (9)
a_mean = a_sum/m_a (10)
b_mean = b_sum/m_b (11)
wherein w_i is a weighting factor, m_a is the number of parameters greater than 0 in the a_i array, and m_b is the number of parameters greater than 0 in the b_i array; two narrow-path sidelines are obtained through the median average filtering algorithm, and the angle bisector L of the included angle of the two narrow-path sidelines is found;
calculating an included angle α between a bisector L of a sideline included angle of the narrow channel and a vertical aiming line of the image, and taking the included angle as a yaw angle α for heading guidance;
the intersection point of the bisector L with the horizontal image sight line L_W is marked as A, and the intersection point of the vertical sight line L_H with the horizontal sight line L_W is marked as B; a vector AB is established, and its modulus |AB| is used as the yaw distance for lateral deviation correction;
the yaw angle α and the yaw distance |AB| are sent to the unmanned aerial vehicle flight controller to guide the unmanned aerial vehicle through the throat; and the algorithm returns to the first stage to start a new calculation cycle.
CN202010212200.9A 2020-03-24 2020-03-24 Three-neighborhood maximum difference edge detection narrow channel guiding method for miniature unmanned aerial vehicle Active CN111445491B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010212200.9A CN111445491B (en) 2020-03-24 2020-03-24 Three-neighborhood maximum difference edge detection narrow channel guiding method for miniature unmanned aerial vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010212200.9A CN111445491B (en) 2020-03-24 2020-03-24 Three-neighborhood maximum difference edge detection narrow channel guiding method for miniature unmanned aerial vehicle

Publications (2)

Publication Number Publication Date
CN111445491A true CN111445491A (en) 2020-07-24
CN111445491B CN111445491B (en) 2023-09-15

Family

ID=71629675

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010212200.9A Active CN111445491B (en) 2020-03-24 2020-03-24 Three-neighborhood maximum difference edge detection narrow channel guiding method for miniature unmanned aerial vehicle

Country Status (1)

Country Link
CN (1) CN111445491B (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101447077A (en) * 2008-12-18 2009-06-03 浙江大学 Edge detection method of color textile texture image oriented to textile industry
CN101751564A (en) * 2010-02-04 2010-06-23 华南理工大学 Intravenous grain extraction method based on maximal intra-neighbor difference vector diagram
CN103530878A (en) * 2013-10-12 2014-01-22 北京工业大学 Edge extraction method based on fusion strategy
CN103925920A (en) * 2014-04-10 2014-07-16 西北工业大学 Image perspective-based micro unmanned aerial vehicle indoor autonomous navigation method
CN104952081A (en) * 2015-07-20 2015-09-30 电子科技大学 COG (Chip-On-Glass) offset detection method based on extreme value difference statistical characteristic
CN105938556A (en) * 2016-04-22 2016-09-14 复旦大学 Wide line detection algorithm based on water flow method
WO2018018987A1 (en) * 2016-07-29 2018-02-01 深圳市未来媒体技术研究院 Calibration pre-processing method for light field camera
WO2019041590A1 (en) * 2017-08-31 2019-03-07 中国科学院微电子研究所 Edge detection method using arbitrary angle
CN109916394A (en) * 2019-04-04 2019-06-21 山东智翼航空科技有限公司 A kind of Integrated Navigation Algorithm merging optical flow position and velocity information
CN109931926A (en) * 2019-04-04 2019-06-25 山东智翼航空科技有限公司 A kind of small drone based on topocentric coordinate system is seamless self-aid navigation algorithm
CN110298216A (en) * 2018-03-23 2019-10-01 中国科学院沈阳自动化研究所 Vehicle deviation warning method based on lane line gradient image adaptive threshold fuzziness

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101447077A (en) * 2008-12-18 2009-06-03 浙江大学 Edge detection method of color textile texture image oriented to textile industry
CN101751564A (en) * 2010-02-04 2010-06-23 华南理工大学 Intravenous grain extraction method based on maximal intra-neighbor difference vector diagram
CN103530878A (en) * 2013-10-12 2014-01-22 北京工业大学 Edge extraction method based on fusion strategy
CN103925920A (en) * 2014-04-10 2014-07-16 西北工业大学 Image perspective-based micro unmanned aerial vehicle indoor autonomous navigation method
CN104952081A (en) * 2015-07-20 2015-09-30 电子科技大学 COG (Chip-On-Glass) offset detection method based on extreme value difference statistical characteristic
CN105938556A (en) * 2016-04-22 2016-09-14 复旦大学 Wide line detection algorithm based on water flow method
WO2018018987A1 (en) * 2016-07-29 2018-02-01 深圳市未来媒体技术研究院 Calibration pre-processing method for light field camera
WO2019041590A1 (en) * 2017-08-31 2019-03-07 中国科学院微电子研究所 Edge detection method using arbitrary angle
CN110298216A (en) * 2018-03-23 2019-10-01 中国科学院沈阳自动化研究所 Vehicle deviation warning method based on lane line gradient image adaptive threshold fuzziness
CN109916394A (en) * 2019-04-04 2019-06-21 山东智翼航空科技有限公司 A kind of Integrated Navigation Algorithm merging optical flow position and velocity information
CN109931926A (en) * 2019-04-04 2019-06-25 山东智翼航空科技有限公司 A kind of small drone based on topocentric coordinate system is seamless self-aid navigation algorithm

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Xing Jingyu; Gao Xu; Han Pu: "A cloud image edge detection algorithm based on Hough transform and ant colony optimization", Microelectronics & Computer *

Also Published As

Publication number Publication date
CN111445491B (en) 2023-09-15

Similar Documents

Publication Publication Date Title
CN108960183B (en) Curve target identification system and method based on multi-sensor fusion
CN108596129B (en) Vehicle line-crossing detection method based on intelligent video analysis technology
CN108647646B (en) Low-beam radar-based short obstacle optimized detection method and device
Kong et al. Vanishing point detection for road detection
CN108230254B (en) Automatic detection method for high-speed traffic full lane line capable of self-adapting scene switching
WO2015010451A1 (en) Method for road detection from one image
CN110569704A (en) Multi-strategy self-adaptive lane line detection method based on stereoscopic vision
CN105511462B (en) A kind of AGV air navigation aids of view-based access control model
CN104282020A (en) Vehicle speed detection method based on target motion track
CN103324936B (en) A kind of vehicle lower boundary detection method based on Multi-sensor Fusion
CN107679520A (en) A kind of lane line visible detection method suitable for complex condition
CN109085823A (en) The inexpensive automatic tracking running method of view-based access control model under a kind of garden scene
CN110379168B (en) Traffic vehicle information acquisition method based on Mask R-CNN
WO1997025700A1 (en) Traffic congestion measuring method and apparatus and image processing method and apparatus
CN111680713B (en) Unmanned aerial vehicle ground target tracking and approaching method based on visual detection
CN108009494A (en) A kind of intersection wireless vehicle tracking based on unmanned plane
CN115049700A (en) Target detection method and device
DE112019001542T5 (en) POSITION ESTIMATE DEVICE
CN106446785A (en) Passable road detection method based on binocular vision
CN114399748A (en) Agricultural machinery real-time path correction method based on visual lane detection
CN115308732A (en) Multi-target detection and tracking method integrating millimeter wave radar and depth vision
CN105300390B (en) The determination method and device of obstructing objects movement locus
CN104156977A (en) Point target movement velocity detection method based on multiple linear moveout scanning, extending and sampling
CN110610130A (en) Multi-sensor information fusion power transmission line robot navigation method and system
Suto Real-time lane line tracking algorithm to mini vehicles

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant