CN110196062B - Navigation method for tracking lane line by single camera - Google Patents
- Publication number
- CN110196062B (application CN201910564812.1A)
- Authority
- CN
- China
- Prior art keywords
- image
- robot
- line
- lane
- inclination angle
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0212—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- General Physics & Mathematics (AREA)
- Aviation & Aerospace Engineering (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Electromagnetism (AREA)
- Image Analysis (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Navigation (AREA)
- Traffic Control Systems (AREA)
Abstract
The invention discloses a navigation method for tracking a lane line with a single camera, comprising the following steps: calibrate the pixel distance conversion ratio at each inclination angle; adjust the inclination angle between the CCD camera's center line and the support rod's axis and capture an image at each inclination angle; identify the lane line in each image and select the initial inclination angle; adjust the inclination angle between the CCD camera's center line and the support rod's axis to the initial inclination angle; obtain the included angle between the lane line in the image and the image's longitudinal center line at the initial inclination angle; adjust the robot's traveling angle until that included angle is zero; obtain the pixel distance between the lane line and the image's longitudinal center line, and the distance between the image's longitudinal center line and the robot's central axis; obtain the lane line's offset from the robot at any moment and adjust it to the robot's preset lane-departure distance; the robot then travels in the lane direction to the end point.
Description
Technical Field
The invention relates to the technical field of road detection, in particular to a navigation method for tracking a lane line by a single camera.
Background
In tunnels and municipal road scenes, automatic equipment such as robots cannot rely on satellite-based positioning and navigation because the satellite signal is weak or blocked. In robotic and other automated work applications, the equipment often needs to follow a regular path along a lane line, for example a road-sweeping robot traversing the interior region of the road back and forth in a zig-zag pattern. Unmanned-driving technology that keeps a vehicle running straight in the center of the lane by detecting the lane line is well established; for example, application No. 201710086674.1, entitled "A vehicle lane line detection system and method", obtains the lane line with a threshold segmentation and straight line extraction algorithm. However, such prior art is not suitable for automated equipment that must position and navigate on the road surface along a customized path, such as traveling parallel to the lane line at an offset of N meters (where N is set by the work task).
Therefore, a navigation method for tracking the lane line by using a single camera, which has a simple structure and is convenient to detect, is urgently needed.
Disclosure of Invention
In view of the above problems, an object of the present invention is to provide a navigation method for tracking a lane line with a single camera, and the technical solution adopted by the present invention is as follows:
A navigation method for tracking a lane line with a single camera involves a robot which travels along the lane direction at any travel angle, and a lane line tracking vision assembly which is mounted at the front of the robot and aligned with the robot's central axis; the lane line tracking vision assembly comprises a support rod whose bottom is mounted at the front of the robot, a tilt servo motor arranged at the top of the support rod, a connecting column connected to the rotating shaft of the tilt servo motor, and a CCD camera mounted on the connecting column and aligned with the robot's central axis;
the navigation method comprises the following steps:
Step S1: obtain the pixel distance conversion ratio f_mn in the nth interval at inclination angle m by calibration with uniformly spaced black-and-white stripes parallel to the robot's central axis; obtain the distance L_m between the image's longitudinal center line and the robot's central axis at inclination angle m; preset the robot's lane-departure distance T; the inclination angle m is greater than -45 degrees and less than 45 degrees;
Step S2: select the starting point of lane detection, adjust the inclination angle m between the CCD camera's center line and the support rod's axis with the tilt servo motor, and capture an image at each inclination angle m;
Step S3: identify the lane line in each image using the threshold segmentation and straight line extraction algorithm, select the inclination angle whose image has the minimum distance between the lane line and the image's center line, and record it as the initial inclination angle i;
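The patent treats the threshold segmentation and straight line extraction algorithm as prior art and does not give its details. A minimal sketch of one such approach is shown below: a grayscale image (a list of pixel rows) is binarised with a fixed threshold, and a line x = a*y + b is fitted to the bright pixels by least squares. All function and parameter names here are illustrative assumptions, not the patent's own code.

```python
# Hedged sketch of threshold segmentation + straight line extraction.
# Parameterising x as a function of y handles near-vertical lane lines,
# which a y = f(x) fit cannot represent.

def extract_lane_line(gray, thresh=200):
    """Fit x = a*y + b to pixels with intensity >= thresh; return (a, b)."""
    pts = [(x, y) for y, row in enumerate(gray)
                  for x, v in enumerate(row) if v >= thresh]
    n = len(pts)
    sx = sum(x for x, _ in pts)
    sy = sum(y for _, y in pts)
    sxy = sum(x * y for x, y in pts)
    syy = sum(y * y for _, y in pts)
    a = (n * sxy - sx * sy) / (n * syy - sy * sy)  # slope dx/dy
    b = (sx - a * sy) / n                          # intercept at y = 0
    return a, b

# Synthetic 5x5 image with a bright vertical line in column 2:
img = [[255 if x == 2 else 0 for x in range(5)] for y in range(5)]
a, b = extract_lane_line(img)   # vertical line: a near 0, b near 2
```

A production system would more likely use an edge detector plus a Hough transform, but the least-squares fit is enough to illustrate the step.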
step S4, adjusting the inclination angle between the central line of the CCD camera and the axis of the supporting rod to an initial inclination angle i, and shooting a moving image by using the CCD camera;
Step S5: identify the lane line in the image using the threshold segmentation and straight line extraction algorithm, and obtain the included angle between the lane line in the image and the image's longitudinal center line at the initial inclination angle i; the image's longitudinal center line is parallel to the robot's central axis;
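Assuming the lane line has been fitted in image coordinates as x = a*y + b (a being the lateral drift per image row), the included angle of step S5 follows directly from the slope; a = 0 means the line is parallel to the image's longitudinal (vertical) center line. This is a hedged sketch, not the patent's own computation.

```python
import math

def included_angle_deg(a):
    """Angle (degrees) between the line x = a*y + b and the vertical centerline."""
    return math.degrees(math.atan(a))

theta = included_angle_deg(0.2679)   # roughly a 15-degree heading error
```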
step S6, adjusting the advancing angle of the robot, and repeating the step S5 until the included angle between the lane line in the image and the longitudinal center line of the image is zero;
Step S7: obtain the pixel distance S between the lane line in the image and the image's longitudinal center line, and the distance L_i between the image's longitudinal center line and the robot's central axis; determine from the pixel distance S the jth interval of the image in which the lane line lies; the pixel distance conversion ratio in the jth interval at the initial inclination angle i is f_ij; the offset of the lane line from the robot at any moment is then f_ij × S + L_i;
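The metric conversion in step S7 can be sketched as follows. The ratio is taken here in metres per pixel (the reciprocal of the calibrated pixels-per-metre value dx/DX), so that multiplying by the pixel distance S yields metres; all numbers are hypothetical.

```python
def lane_offset(S, f_ij, L_i):
    """Offset of the lane line from the robot's axis: f_ij * S + L_i."""
    return f_ij * S + L_i

# Hypothetical metres-per-pixel ratio 1/505 for interval j, and calibrated
# centerline offset L_i = 0.35 m; a pixel distance S = 101 px then gives
# an offset of about 0.55 m.
d = lane_offset(S=101.0, f_ij=1 / 505.0, L_i=0.35)
```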
Step S8: adjust the robot's translational motion and repeat step S7 until the lane-line offset equals the robot's preset lane-departure distance T;
Step S9: the robot travels in the lane direction, repeating steps S5 to S8 until it reaches the end point of the lane detection.
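The S5-S9 loop amounts to: first null the heading error (the included angle), then null the lateral error against the preset offset T, then advance. The sketch below uses callback stand-ins for the camera and drive hardware, which the patent does not specify; names and tolerances are assumptions.

```python
def track_lane(get_angle, get_offset, rotate, translate, advance, at_end, T,
               eps_angle=0.5, eps_dist=0.01):
    while not at_end():
        while abs(get_angle()) > eps_angle:       # step S6: heading correction
            rotate(-get_angle())
        while abs(get_offset() - T) > eps_dist:   # step S8: lateral correction
            translate(get_offset() - T)
        advance()                                 # step S9: move along the lane

# Toy simulation standing in for the camera and drive system:
state = {"angle": 4.0, "offset": 0.60, "pos": 0}
track_lane(get_angle=lambda: state["angle"],
           get_offset=lambda: state["offset"],
           rotate=lambda a: state.update(angle=state["angle"] + a),
           translate=lambda d: state.update(offset=state["offset"] - d),
           advance=lambda: state.update(pos=state["pos"] + 1),
           at_end=lambda: state["pos"] >= 3,
           T=0.50)
```

After the run, the simulated heading error is zeroed and the lateral offset sits at the preset 0.50 m while the robot has advanced three steps.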
Further, in step S1, obtaining the pixel distance conversion ratio f_mn in the nth interval at inclination angle m specifically comprises the following steps:
Step S11: using the uniformly spaced black-and-white stripes parallel to the robot's central axis, at inclination angle m, measure the distance L_m between the stripe corresponding to the image's longitudinal center line and the robot's central axis, and the actual distance DX_mn from the center of that stripe to the nth stripe line;
Step S12: compute the pixel distance dx_mn between the nth stripe line in the image and the image's longitudinal center line using the threshold segmentation and straight line extraction algorithm; the pixel distance conversion ratio f_mn in the nth interval is:

f_mn = dx_mn / DX_mn.
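The calibration of steps S11-S12 can be sketched with hypothetical measurements: DX_mn are the actual distances of stripe lines 1 to 3 from the stripe under the image's longitudinal center line, dx_mn are the matching pixel distances measured in the image, and each ratio is f_mn = dx_mn / DX_mn.

```python
def calibrate_ratios(dx, DX):
    """Per-interval conversion ratios f_n = dx_n / DX_n (pixels per metre)."""
    return [d / D for d, D in zip(dx, DX)]

# Hypothetical measurements with stripes 0.10 m apart; the pixel spacing
# shrinks toward the image edge as perspective compresses distant stripes.
dx_m = [52.0, 101.0, 146.0]   # pixel distances of stripe lines 1..3
DX_m = [0.10, 0.20, 0.30]     # actual distances in metres
f_m = calibrate_ratios(dx_m, DX_m)
```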
Preferably, the inclination angle m is one of -40°, -30°, -20°, -10°, 0°, 10°, 20°, 30°, and 40°.
Compared with the prior art, the invention has the following beneficial effects:
(1) The invention uses a single CCD camera to capture images of the road surface at an initial inclination angle and automatically identifies the lane line. By measuring the distance and rotation angle of the lane line relative to the image's longitudinal center line, it converts them into the horizontal distance and rotation angle of the lane line relative to the vehicle body using the pre-calibrated results for that inclination angle, and adjusts the robot's traveling attitude and offset distance from the lane line according to the work task, thereby achieving autonomous navigation.
(2) The invention obtains the pixel distance conversion ratio in every interval by calibration with uniformly spaced black-and-white stripes, and computes the lane line's offset from the robot at any moment, so that this offset can be adjusted to the preset lane-departure distance and the robot can accurately track the lane line.
(3) By adjusting the robot's traveling angle until the lane line in the image is parallel to the image's longitudinal center line, the robot always travels parallel to the lane line, effectively preventing it from drifting off the lane line.
(4) The method identifies the lane lines in the image by using a threshold segmentation and straight line extraction algorithm, and selects the inclination angle corresponding to the image with the minimum distance difference between the lane lines and the central line of the image, thereby ensuring the effectiveness and accuracy of the shot image.
In conclusion, the method has the advantages of simple logic, accurate tracking and the like, and has high practical value and popularization value in the technical field of road detection.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present invention, and therefore should not be considered as limiting the scope of protection, and it is obvious for those skilled in the art that other related drawings can be obtained according to these drawings without inventive efforts.
FIG. 1 is a schematic structural diagram of the present invention.
FIG. 2 is a schematic diagram of the configuration of the lane line tracking vision assembly of the present invention.
Fig. 3 is a schematic diagram of the inclination angle between the CCD camera and the supporting rod according to the present invention.
FIG. 4 is a schematic diagram of pixel distance scaling calibration according to the present invention.
FIG. 5 is a flow chart of the tracking navigation method of the present invention.
In the drawings, the names of the parts corresponding to the reference numerals are as follows:
the system comprises a lane line 1, a lane line 2, a robot 3, a lane line tracking visual component, a support rod 31, a tilt servo motor 32, a connecting column 33 and a CCD camera 34.
Detailed Description
To further clarify the objects, technical solutions and advantages of the present application, the present invention will be further described with reference to the accompanying drawings and examples, and embodiments of the present invention include, but are not limited to, the following examples. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Examples
As shown in fig. 1 to 5, this embodiment provides a navigation method for tracking a lane line with a single camera, and it should be noted that in this embodiment, directional terms such as "bottom", "top", "peripheral edge", "center", and the like are explained based on the drawings. In addition, the threshold segmentation and straight line extraction algorithm of the present embodiment belongs to the prior art, and is not described herein again.
The hardware involved in this embodiment comprises a robot 2 which travels along the lane direction at any travel angle, and a lane line tracking vision assembly 3 which is mounted at the front of the robot 2 and aligned with the central axis of the robot 2; the lane line tracking vision assembly 3 comprises a support rod 31 whose bottom is mounted at the front of the robot 2, a tilt servo motor 32 arranged at the top of the support rod 31, a connecting column 33 connected to the rotating shaft of the tilt servo motor 32, and a CCD camera 34 mounted on the connecting column 33 and aligned with the central axis of the robot 2.
The navigation method for tracking the lane line with a single camera is briefly described below and includes the following steps:
firstly, a pixel distance conversion proportion f in the nth interval under the inclination angle m is obtained by utilizing the calibration of uniformly spaced black and white stripes parallel to the central axis of the robotmn(ii) a Obtaining the distance L between the longitudinal central line of the image and the central axis of the robot under the inclination angle mm(ii) a Presetting a lane departure distance T of the robot; the inclination angle m is one of-40 degrees, -30 degrees, -20 degrees, -10 degrees, 0 degrees, 10 degrees, 20 degrees, 30 degrees and 40 degrees.
(1) Using the uniformly spaced black-and-white stripes parallel to the robot's central axis, at inclination angle m, measure the distance L_m between the stripe corresponding to the image's longitudinal center line and the robot's central axis, and the actual distance DX_mn from the center of that stripe to the nth stripe line;
(2) Compute the pixel distance dx_mn between the nth stripe line in the image and the image's longitudinal center line using the threshold segmentation and straight line extraction algorithm; the pixel distance conversion ratio f_mn in the nth interval is:

f_mn = dx_mn / DX_mn.
Here, the nth interval refers to all pixels whose lateral pixel distance from the image's longitudinal center line lies within [dx_m(n-1), dx_mn].
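The interval lookup implied by this definition can be sketched as a binary search over the calibrated pixel boundaries dx_m1 <= dx_m2 <= ...; a measured pixel distance s falls in interval n when dx_m(n-1) <= s <= dx_mn, with dx_m0 = 0. The boundary values below are hypothetical.

```python
import bisect

def interval_index(s, dx_bounds):
    """1-based interval n containing pixel distance s (dx_bounds sorted, dx_0 = 0 omitted)."""
    return bisect.bisect_left(dx_bounds, s) + 1

bounds = [52.0, 101.0, 146.0]
n1 = interval_index(30.0, bounds)    # falls in interval 1
n3 = interval_index(120.0, bounds)   # falls in interval 3
```

`bisect_left` assigns a value that lands exactly on a boundary to the lower interval; the patent does not specify which side a boundary pixel belongs to.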
Secondly, select the starting point of lane detection, adjust the inclination angle m between the CCD camera's center line and the support rod's axis with the tilt servo motor, and capture an image at each inclination angle m.
and thirdly, recognizing the lane lines in the image by using a threshold segmentation and straight line extraction algorithm, selecting the inclination angle corresponding to the image with the minimum distance difference between the lane lines and the central line of the image in the image, and marking the inclination angle as an initial inclination angle i.
And fourthly, adjusting the inclination angle between the central line of the CCD camera and the axis of the supporting rod to the initial inclination angle i, and shooting an advancing image by using the CCD camera.
Fifthly, recognizing lane lines in the image by using a threshold segmentation and straight line extraction algorithm, and solving an included angle between the lane lines in the image and a longitudinal center line of the image under the initial inclination angle i; the longitudinal centerline of the image is parallel to the central axis of the robot.
Sixthly, adjusting the advancing angle of the robot, and repeating the fifth step until the included angle between the lane line in the image and the longitudinal center line of the image is zero;
seventhly, the pixel distance S between the lane line and the longitudinal center line of the image in the image and the distance L between the longitudinal center line of the image and the central axis of the robot are obtainedi(ii) a Obtaining that the lane line is positioned in the jth interval in the image according to the pixel distance S; the conversion proportion of the pixel distance in the jth interval under the initial inclination angle i is fij(ii) a Obtaining the distance f between the lane line and the robot at any timeij×S+Li。
And eighthly, adjusting the translational motion of the robot, and repeating the seventh step until the lane line deviation distance of the robot is adjusted to the preset lane deviation distance T of the robot.
And step nine, the robot advances along the lane direction, and the fifth step to the eighth step are repeated until the robot drives to the end point of the lane detection.
In conclusion, the invention fills the technical gap of road-surface positioning and tracking in scenes with weak or blocked satellite signals, such as tunnels and municipal roads. Compared with the prior art, it has prominent substantive features and marked progress, and it has high practical and promotional value in the technical field of road detection.
The above-mentioned embodiments are only preferred embodiments of the present invention, and do not limit the scope of the present invention, but all the modifications made by the principles of the present invention and the non-inventive efforts based on the above-mentioned embodiments shall fall within the scope of the present invention.
Claims (3)
1. A navigation method for tracking a lane line with a single camera, involving a robot (2) which travels along the lane direction at any travel angle, characterized by further comprising a lane line tracking vision assembly (3) which is mounted at the front of the robot (2) and aligned with the central axis of the robot (2); the lane line tracking vision assembly (3) comprises a support rod (31) whose bottom is mounted at the front of the robot (2), a tilt servo motor (32) arranged at the top of the support rod (31), a connecting column (33) connected to the rotating shaft of the tilt servo motor (32), and a CCD camera (34) mounted on the connecting column (33) and aligned with the central axis of the robot (2); the navigation method comprises the following steps:
step S1, obtaining the pixel distance conversion ratio f_mn in the nth interval at inclination angle m by calibration with uniformly spaced black-and-white stripes parallel to the robot's central axis; obtaining the distance L_m between the image's longitudinal center line and the robot's central axis at inclination angle m; presetting the robot's lane-departure distance T; the inclination angle m being greater than -45 degrees and less than 45 degrees;
step S2, selecting the starting point of lane detection, adjusting the inclination angle m between the CCD camera's center line and the support rod's axis with the tilt servo motor, and capturing an image at each inclination angle m;
step S3, identifying the lane line in each image using the threshold segmentation and straight line extraction algorithm, selecting the inclination angle whose image has the minimum distance between the lane line and the image's center line, and recording it as the initial inclination angle i;
step S4, adjusting the inclination angle between the central line of the CCD camera and the axis of the supporting rod to an initial inclination angle i, and shooting a moving image by using the CCD camera;
step S5, recognizing lane lines in the image by using a threshold segmentation and straight line extraction algorithm, and obtaining an included angle between the lane lines in the image and a longitudinal center line of the image under the initial inclination angle i; the longitudinal center line of the image is parallel to the central axis of the robot;
step S6, adjusting the advancing angle of the robot, and repeating the step S5 until the included angle between the lane line in the image and the longitudinal center line of the image is zero;
step S7, obtaining the pixel distance S between the lane line in the image and the image's longitudinal center line, and the distance L_i between the image's longitudinal center line and the robot's central axis; determining from the pixel distance S the jth interval of the image in which the lane line lies; the pixel distance conversion ratio in the jth interval at the initial inclination angle i being f_ij; the offset of the lane line from the robot at any moment being f_ij × S + L_i;
step S8, adjusting the robot's translational motion and repeating step S7 until the lane-line offset equals the robot's preset lane-departure distance T;
in step S9, the robot travels in the lane direction, and repeats steps S5 to S8 until the robot travels to the end of the lane detection.
2. The navigation method for tracking the lane line with a single camera according to claim 1, wherein the step S1 specifically includes the following steps:
step S11, using the uniformly spaced black-and-white stripes parallel to the robot's central axis, at inclination angle m, measuring the distance L_m between the stripe corresponding to the image's longitudinal center line and the robot's central axis, and the actual distance DX_mn from the center of that stripe to the nth stripe line;
step S12, computing the pixel distance dx_mn between the nth stripe line in the image and the image's longitudinal center line using the threshold segmentation and straight line extraction algorithm; the pixel distance conversion ratio f_mn in the nth interval being:

f_mn = dx_mn / DX_mn.
3. The navigation method for tracking a lane line with a single camera according to claim 1, wherein the inclination angle m is one of -40°, -30°, -20°, -10°, 0°, 10°, 20°, 30°, and 40°.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910564812.1A CN110196062B (en) | 2019-06-27 | 2019-06-27 | Navigation method for tracking lane line by single camera |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910564812.1A CN110196062B (en) | 2019-06-27 | 2019-06-27 | Navigation method for tracking lane line by single camera |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110196062A CN110196062A (en) | 2019-09-03 |
CN110196062B true CN110196062B (en) | 2022-03-25 |
Family
ID=67755366
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910564812.1A Active CN110196062B (en) | 2019-06-27 | 2019-06-27 | Navigation method for tracking lane line by single camera |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110196062B (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109389650B (en) | 2018-09-30 | 2021-01-12 | 京东方科技集团股份有限公司 | Vehicle-mounted camera calibration method and device, vehicle and storage medium |
CN110865663B (en) * | 2019-12-05 | 2022-11-15 | 成都圭目机器人有限公司 | Novel speed compensation torque balance control method applied to four-wheel-drive four-wheel robot |
CN111273674A (en) * | 2020-03-12 | 2020-06-12 | 深圳冰河导航科技有限公司 | Distance measurement method, vehicle operation control method and control system |
CN111896012A (en) * | 2020-03-15 | 2020-11-06 | 上海谕培汽车科技有限公司 | Vehicle-mounted navigation method based on machine vision |
CN114167850B (en) * | 2020-08-21 | 2024-02-20 | 富联精密电子(天津)有限公司 | Self-propelled triangular warning frame and travelling control method thereof |
Citations (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5911767A (en) * | 1994-10-04 | 1999-06-15 | Garibotto; Giovanni | Navigation system for an autonomous mobile robot |
CN101776438A (en) * | 2010-01-26 | 2010-07-14 | 武汉理工大学 | Measuring device and method of road mark |
CN102661733A (en) * | 2012-05-28 | 2012-09-12 | 天津工业大学 | Front vehicle ranging method based on monocular vision |
CN103488976A (en) * | 2013-09-17 | 2014-01-01 | 北京联合大学 | Stop mark real-time detection and distance measurement method based on intelligent driving |
CN103630122A (en) * | 2013-10-15 | 2014-03-12 | 北京航天科工世纪卫星科技有限公司 | Monocular vision lane line detection method and distance measurement method thereof |
CN103991449A (en) * | 2014-06-12 | 2014-08-20 | 北京联合大学 | Vehicle travelling control method and system |
CN104992145A (en) * | 2015-06-15 | 2015-10-21 | 山东大学 | Moment sampling lane tracking detection method |
CN105460009A (en) * | 2015-11-30 | 2016-04-06 | 奇瑞汽车股份有限公司 | Automobile control method and device |
CN105488454A (en) * | 2015-11-17 | 2016-04-13 | 天津工业大学 | Monocular vision based front vehicle detection and ranging method |
CN105651286A (en) * | 2016-02-26 | 2016-06-08 | 中国科学院宁波材料技术与工程研究所 | Visual navigation method and system of mobile robot as well as warehouse system |
CN105667518A (en) * | 2016-02-25 | 2016-06-15 | 福州华鹰重工机械有限公司 | Lane detection method and device |
CN105868469A (en) * | 2016-03-28 | 2016-08-17 | 湖南大学 | Lane departure forewarning method based on perspective image and forewarning model construction method |
CN106740841A (en) * | 2017-02-14 | 2017-05-31 | 驭势科技(北京)有限公司 | Method for detecting lane lines, device and mobile unit based on dynamic control |
CN106828489A (en) * | 2017-02-14 | 2017-06-13 | 中国科学院自动化研究所 | A kind of vehicle travel control method and device |
CN106874875A (en) * | 2017-02-17 | 2017-06-20 | 武汉理工大学 | A kind of vehicle-mounted lane detection system and method |
CN106981082A (en) * | 2017-03-08 | 2017-07-25 | 驭势科技(北京)有限公司 | Vehicle-mounted camera scaling method, device and mobile unit |
CN107229908A (en) * | 2017-05-16 | 2017-10-03 | 浙江理工大学 | A kind of method for detecting lane lines |
CN206623754U (en) * | 2017-02-14 | 2017-11-10 | 驭势科技(北京)有限公司 | Lane detection device |
CN107389026A (en) * | 2017-06-12 | 2017-11-24 | 江苏大学 | A kind of monocular vision distance-finding method based on fixing point projective transformation |
CN108362205A (en) * | 2017-11-14 | 2018-08-03 | 沈阳工业大学 | Space ranging method based on fringe projection |
CN108664016A (en) * | 2017-03-31 | 2018-10-16 | 腾讯科技(深圳)有限公司 | Determine the method and device of lane center |
CN109062213A (en) * | 2018-08-16 | 2018-12-21 | 郑州轻工业学院 | A method of the intelligent vehicle automatic Pilot based on modified proportional guidance |
CN109145860A (en) * | 2018-09-04 | 2019-01-04 | 百度在线网络技术(北京)有限公司 | Lane line tracking and device |
CN109389650A (en) * | 2018-09-30 | 2019-02-26 | 京东方科技集团股份有限公司 | A kind of scaling method of in-vehicle camera, device, vehicle and storage medium |
CN109477728A (en) * | 2016-07-27 | 2019-03-15 | 大众汽车有限公司 | For determining method, apparatus of the vehicle relative to the lateral position in the lane on road surface and the computer readable storage medium with instruction |
CN109544645A (en) * | 2018-11-27 | 2019-03-29 | 苏州杰锐思自动化设备有限公司 | The method of camera module group lens inclination angle calibration |
CN109785667A (en) * | 2019-03-11 | 2019-05-21 | 百度在线网络技术(北京)有限公司 | Deviation recognition methods, device, equipment and storage medium |
Patent Citations (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5911767A (en) * | 1994-10-04 | 1999-06-15 | Garibotto; Giovanni | Navigation system for an autonomous mobile robot |
CN101776438A (en) * | 2010-01-26 | 2010-07-14 | 武汉理工大学 | Measuring device and method of road mark |
CN102661733A (en) * | 2012-05-28 | 2012-09-12 | 天津工业大学 | Front vehicle ranging method based on monocular vision |
CN103488976A (en) * | 2013-09-17 | 2014-01-01 | 北京联合大学 | Stop mark real-time detection and distance measurement method based on intelligent driving |
CN103630122A (en) * | 2013-10-15 | 2014-03-12 | 北京航天科工世纪卫星科技有限公司 | Monocular vision lane line detection method and distance measurement method thereof |
CN103991449A (en) * | 2014-06-12 | 2014-08-20 | 北京联合大学 | Vehicle travelling control method and system |
CN104992145A (en) * | 2015-06-15 | 2015-10-21 | 山东大学 | Moment sampling lane tracking detection method |
CN105488454A (en) * | 2015-11-17 | 2016-04-13 | 天津工业大学 | Monocular vision based front vehicle detection and ranging method |
CN105460009A (en) * | 2015-11-30 | 2016-04-06 | 奇瑞汽车股份有限公司 | Automobile control method and device |
CN105667518A (en) * | 2016-02-25 | 2016-06-15 | 福州华鹰重工机械有限公司 | Lane detection method and device |
CN105651286A (en) * | 2016-02-26 | 2016-06-08 | 中国科学院宁波材料技术与工程研究所 | Visual navigation method and system of mobile robot as well as warehouse system |
CN105868469A (en) * | 2016-03-28 | 2016-08-17 | 湖南大学 | Lane departure forewarning method based on perspective image and forewarning model construction method |
CN109477728A (en) * | 2016-07-27 | 2019-03-15 | 大众汽车有限公司 | Method, apparatus and computer-readable storage medium with instructions for determining the lateral position of a vehicle relative to the lanes of a road surface |
CN106740841A (en) * | 2017-02-14 | 2017-05-31 | 驭势科技(北京)有限公司 | Lane line detection method and device based on dynamic control, and vehicle-mounted unit |
CN106828489A (en) * | 2017-02-14 | 2017-06-13 | 中国科学院自动化研究所 | Vehicle travel control method and device |
CN206623754U (en) * | 2017-02-14 | 2017-11-10 | 驭势科技(北京)有限公司 | Lane line detection device |
CN106874875A (en) * | 2017-02-17 | 2017-06-20 | 武汉理工大学 | Vehicle-mounted lane line detection system and method |
CN106981082A (en) * | 2017-03-08 | 2017-07-25 | 驭势科技(北京)有限公司 | Vehicle-mounted camera calibration method, device and vehicle-mounted unit |
CN108664016A (en) * | 2017-03-31 | 2018-10-16 | 腾讯科技(深圳)有限公司 | Method and device for determining the lane centerline |
CN107229908A (en) * | 2017-05-16 | 2017-10-03 | 浙江理工大学 | Lane line detection method |
CN107389026A (en) * | 2017-06-12 | 2017-11-24 | 江苏大学 | Monocular vision ranging method based on fixed-point projective transformation |
CN108362205A (en) * | 2017-11-14 | 2018-08-03 | 沈阳工业大学 | Spatial ranging method based on fringe projection |
CN109062213A (en) * | 2018-08-16 | 2018-12-21 | 郑州轻工业学院 | Intelligent vehicle automatic driving method based on modified proportional guidance |
CN109145860A (en) * | 2018-09-04 | 2019-01-04 | 百度在线网络技术(北京)有限公司 | Lane line tracking method and device |
CN109389650A (en) * | 2018-09-30 | 2019-02-26 | 京东方科技集团股份有限公司 | Calibration method and device for a vehicle-mounted camera, vehicle, and storage medium |
CN109544645A (en) * | 2018-11-27 | 2019-03-29 | 苏州杰锐思自动化设备有限公司 | Method for calibrating the lens tilt angle of a camera module |
CN109785667A (en) * | 2019-03-11 | 2019-05-21 | 百度在线网络技术(北京)有限公司 | Deviation recognition method, device, equipment and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN110196062A (en) | 2019-09-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110196062B (en) | Navigation method for tracking lane line by single camera | |
CN107229908B (en) | Lane line detection method | |
JP6533619B2 (en) | Sensor calibration system | |
US10392756B2 (en) | Roadway marker control system | |
US20180283892A1 (en) | Automated image labeling for vehicles based on maps | |
CN107251055B (en) | Corridor capture | |
US20180067494A1 (en) | Automated-vehicle 3d road-model and lane-marking definition system | |
EP0446902B1 (en) | Automatic travelling apparatus | |
CN108549383B (en) | Real-time multi-sensor community robot navigation method | |
CN109948470B (en) | Hough transform-based parking line distance detection method and system | |
CN107505948B (en) | Attitude adjustment method for imaging along curved strips during agile satellite maneuvering | |
US5762292A (en) | Apparatus for identification and tracking of objects | |
Zhang et al. | 3D LiDAR-based intersection recognition and road boundary detection method for unmanned ground vehicles | |
CN111044073A (en) | High-precision AGV position sensing method based on binocular laser | |
US11702089B2 (en) | Multi-sensor sequential calibration system | |
US11952038B2 (en) | Transverse steering method and transverse steering device for moving a vehicle into a target position, and vehicle for this purpose | |
WO2019208101A1 (en) | Position estimating device | |
DE69822187T2 (en) | Method and device for locating and guiding a vehicle provided with a linear camera | |
CN114399748A (en) | Agricultural machinery real-time path correction method based on visual lane detection | |
JPH07120555A (en) | Environment recognition device for vehicle | |
CN110239636B (en) | Coordinate correction system and correction method of unmanned equipment | |
DE19818473C2 (en) | Method for determining the position of a vehicle | |
US5142659A (en) | Estimation of local surface geometry from relative range images for object recognition | |
Hanawa et al. | Development of a stereo vision system to assist the operation of agricultural tractors | |
CN116704019A (en) | Drilling and anchoring robot monocular vision positioning method based on anchor rod network |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||