CN111241894A - Method for detecting lane line and vehicle-mounted equipment - Google Patents
Method for detecting lane line and vehicle-mounted equipment
- Publication number: CN111241894A
- Application number: CN201811448712.4A
- Authority
- CN
- China
- Prior art keywords
- line
- edge line
- left edge
- double yellow
- straight line
- Prior art date
- Legal status: Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/25—Determination of region of interest [ROI] or a volume of interest [VOI]
Abstract
The invention provides a method for detecting lane lines and a vehicle-mounted device. The method comprises: acquiring a left edge line from a grayscale image of a road and judging whether the left edge line is one of the two lines of a double yellow line; if so, outputting the line of the double yellow line with the larger polar radius as the lane line; if not, outputting the left edge line itself as the lane line. The invention can therefore quickly and accurately detect the inner line from the characteristics of the double yellow line, prevent false departure warnings caused by jitter in the detection result, and help ensure driving safety.
Description
Technical Field
The invention relates to the technical field of intelligent transportation and driving assistance, and in particular to a method for detecting lane lines and a vehicle-mounted device.
Background
A lane departure warning system (LDWS) is an important component of vehicle safety driving assistance: it helps the driver reduce or even avoid lane departure accidents by raising an alarm. The system continuously acquires road-scene images with a monocular camera, obtains the position parameters of the lane lines in the images through image processing, and issues a warning signal when it detects that the vehicle is unintentionally drifting out of its lane, reminding the driver to correct course and thereby reducing lane departure accidents.
Lane line detection is the core technology of a lane departure warning system, and the accuracy of its result directly determines the effectiveness of the subsequent warning. If there is only one lane line on each side of the vehicle, both lines can be detected reliably by Hough-transform line detection. However, if a double yellow line lies on one side of the vehicle, its two component lines have very similar features, so the inner line cannot be reliably singled out. Since a double yellow line is generally the boundary between opposing lanes, an incorrect detection can easily lead to a traffic accident.
Disclosure of Invention
In view of this, the present invention provides a method for detecting lane lines and a vehicle-mounted device, to solve the prior-art problem that the inner line of a double yellow line cannot be accurately detected.
Specifically, the invention is realized by the following technical scheme:
the invention provides a method for detecting lane lines, which comprises the following steps:
acquiring a left edge line from a grayscale image of a road;
judging whether the left edge line is one of the two lines of a double yellow line;
if so, outputting the line of the double yellow line with the larger polar radius as a lane line;
if not, outputting the left edge line as a lane line.
In one embodiment, acquiring the left edge line from the grayscale image of the road includes:
selecting a region of interest from the grayscale image of the road;
filtering and binarizing the region of interest to obtain a left edge region;
performing a Hough transform on the left edge region to detect the left edge line.
In one embodiment, judging whether the left edge line is one of the two lines of a double yellow line includes:
taking the left edge line, together with the straight line in its neighborhood whose vote count is closest to that of the left edge line, as the candidate double yellow line pair;
selecting the coordinates of the two points where the candidate lines cross an arbitrary image row, and calculating the width between the candidate lines from those coordinates;
if the width satisfies a preset condition, determining that the left edge line is one of the two lines of a double yellow line;
if the width does not satisfy the preset condition, determining that the left edge line is not part of a double yellow line.
In one embodiment, taking the left edge line and the closest-voted neighboring straight line as the candidate double yellow line pair includes:
obtaining the vote count of the left edge line;
acquiring, from the neighborhood, the target straight line whose vote count is closest to that of the left edge line;
selecting the target straight line and the left edge line as the candidate pair, where the difference between their polar radii is smaller than a first threshold and the difference between their polar angles is smaller than a second threshold.
In one embodiment, the neighborhood is determined by:
computing the Hough matrix containing the left edge line;
determining a region of the Hough matrix adjacent to the position of the left edge line as the neighborhood.
Based on the same conception, the invention also provides vehicle-mounted equipment, which comprises a memory, a processor, a communication interface and a communication bus;
the memory, the processor and the communication interface are communicated with each other through the communication bus;
the memory is used for storing a computer program;
the processor is configured to execute the computer program stored in the memory, and when the processor executes the computer program, any step of the method for detecting a lane line is implemented.
Based on the same concept, the present invention also provides a computer-readable storage medium having a computer program stored therein, which, when executed by a processor, implements any one of the steps of the method of detecting lane lines.
In this way, the left edge line can be obtained from the grayscale image of the road; if the left edge line is determined to be one of the two lines of a double yellow line, the line with the larger polar radius is selected from the pair and output as the lane line; otherwise, the left edge line is considered a single yellow line and is output directly as the lane line. Compared with the prior art, the invention can quickly and accurately detect the inner line from the characteristics of the double yellow line, prevent false departure warnings caused by jitter in the detection result, and thus help ensure driving safety.
Drawings
FIG. 1 is a flowchart of a method of detecting lane lines in an exemplary embodiment of the invention;
FIG. 2 is a schematic diagram of lane detection in the prior art;
FIG. 3 is a schematic view of a region of interest in an exemplary embodiment of the invention;
FIG. 4 is a schematic diagram of coordinates in an exemplary embodiment of the invention;
FIG. 5 is a schematic view of lane detection in an exemplary embodiment of the invention;
FIG. 6 is a logical block diagram of a computer device in an exemplary embodiment of the invention.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present application. The word "if" as used herein may be interpreted as "upon", "when", or "in response to determining", depending on the context.
In order to solve the problems in the prior art, the invention provides a method for detecting a lane line and vehicle-mounted equipment, which can acquire a left edge line from a gray scale image of a road, and can select a straight line with a larger polar diameter from double yellow lines to be output as the lane line if the left edge line is determined to be a straight line in the double yellow lines; otherwise, the left edge line is considered as a single yellow line, and therefore the left edge line is directly output as the lane line. Compared with the prior art, the method can quickly and accurately detect the inner straight line in the double yellow lines according to the characteristics of the double yellow lines, prevent deviation early warning errors caused by jumping of detection results, and guarantee driving safety.
Referring to FIG. 1, a flowchart of a method for detecting lane lines according to an exemplary embodiment of the invention is shown. The method includes:
Step 101: acquiring a left edge line from a grayscale image of a road.
In this embodiment, a grayscale image of the road may be acquired by an optical device, and the image is then processed to obtain the left edge line in the grayscale image.
Specifically, a region of interest (ROI) may first be selected from the grayscale image. The optical device usually captures a section of road video, and each grayscale frame is processed. A frame contains not only road information but also roadside buildings, sky, and so on; lane line detection needs to analyze only the road information, and processing the whole image would add many useless computations. In this embodiment an ROI is therefore pre-defined, as shown in FIG. 3: it contains only the road information, and all other useless information is discarded, reducing the amount of computation.
Next, the region of interest is filtered and binarized to obtain a left edge region. For example, a [-1, 0, 1] template is used to filter the image in the region of interest so that only the left edge region of each lane line remains; the result is then binarized, with values below 0 set to 0 and values above 255 set to 255, so the image presents an obvious visual effect containing only black and white. Since lane lines have a certain width, a Hough transform may further be applied to the binarized image, for convenience of calculation, to detect the left edge line within the left edge region.
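As a concrete sketch of this filtering-and-binarization step, the following pure-Python fragment applies the [-1, 0, 1] template and binarizes the response. The kernel and clipping range come from the text above; the function name and the threshold value are illustrative assumptions, not the patent's.

```python
def left_edge_filter(gray, threshold=40):
    """Apply a [-1, 0, 1] horizontal filter to a grayscale image (a list of
    rows of pixel values) and keep only positive responses, i.e. dark-to-bright
    transitions, which mark the left edge of a bright marking on dark asphalt.
    The response is clipped to [0, 255] and then binarized."""
    h, w = len(gray), len(gray[0])
    out = [[0] * w for _ in range(h)]
    for r in range(h):
        for c in range(1, w - 1):
            response = gray[r][c + 1] - gray[r][c - 1]   # [-1, 0, 1] kernel
            response = max(0, min(255, response))        # clip to [0, 255]
            out[r][c] = 255 if response >= threshold else 0  # binarize
    return out

# A dark road with one bright vertical marking: only its left edge survives,
# because the bright-to-dark (right) edge produces a negative, clipped response.
road = [[20, 20, 20, 200, 200, 20, 20] for _ in range(3)]
edges = left_edge_filter(road)
```

The binarized `edges` image is what the Hough transform would then be run on.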
Step 102: judging whether the left edge line is one of the two lines of a double yellow line; if yes, go to step 103; if not, go to step 104.
If the lane marking on the driving road is a double yellow line, its two component lines are very similar in color, texture, geometry, and other characteristics. Conventional Hough-transform line detection will find one of them, but which one is essentially random: it may be the inner or the outer line. FIG. 2 shows a case where the outer line is detected; the area marked by the left black line is the outer line of the double yellow line. This is extremely dangerous for lane departure warning, because a double yellow line is often the boundary between opposing lanes, and inaccurate departure information can trigger an accident.
When the vehicle runs on the road, in general only the left lane marking can be a double yellow line, so only the left edge line detected by the Hough transform needs to be re-checked to judge whether it is one of the two lines of a double yellow line.
In one embodiment, the left edge line, together with the straight line in its neighborhood whose vote count is closest, is taken as the candidate double yellow line pair. Specifically, the vote count of the left edge line is obtained; when the left edge line is computed by the Hough transform, a Hough matrix containing it is also obtained, a preset region adjacent to the position of the left edge line in the Hough matrix is determined as the neighborhood, and the target straight line whose vote count is closest to that of the left edge line is acquired from this neighborhood. The target line and the left edge line are then selected as the candidate pair, where the difference between their polar radii is smaller than a first threshold and the difference between their polar angles is smaller than a second threshold.
For example, assume the Hough detection gives the left edge line (ρ1, θ1), where ρ is the polar radius and θ is the polar angle. A small neighborhood of the left edge line is searched in the Hough matrix; when a position (ρ2, θ2) is found whose vote count is closest to the current vote count, the polar radii and polar angles of the two lines are compared. If |ρ1 − ρ2| < T1 and |θ1 − θ2| < T2, the two lines are determined to be the candidate double yellow line pair.
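A minimal sketch of this neighborhood search follows. The dictionary representation of the Hough accumulator, the function name, and the threshold values T1 and T2 are illustrative assumptions; only the selection rule (closest vote count within the polar-radius and polar-angle limits) is taken from the text above.

```python
def find_candidate_partner(hough_votes, rho1, theta1, T1=25.0, T2=0.05):
    """hough_votes: dict mapping (rho, theta) -> Hough vote count.
    Returns the line (rho2, theta2) whose vote count is closest to that of
    the left edge line (rho1, theta1), restricted to the neighborhood
    |rho1 - rho2| < T1 and |theta1 - theta2| < T2, or None if none exists."""
    votes1 = hough_votes[(rho1, theta1)]
    best, best_gap = None, None
    for (rho2, theta2), votes2 in hough_votes.items():
        if (rho2, theta2) == (rho1, theta1):
            continue
        # restrict to the polar-radius / polar-angle neighborhood
        if abs(rho1 - rho2) >= T1 or abs(theta1 - theta2) >= T2:
            continue
        gap = abs(votes1 - votes2)  # closeness of the vote counts
        if best_gap is None or gap < best_gap:
            best, best_gap = (rho2, theta2), gap
    return best

votes = {(100.0, 0.00): 80,   # detected left edge line
         (112.0, 0.01): 76,   # nearby line with a similar vote count
         (300.0, 1.20): 90}   # unrelated line, outside the neighborhood
partner = find_candidate_partner(votes, 100.0, 0.00)
```

Here `partner` is the candidate second line of the double yellow pair; the unrelated line is rejected by the ρ threshold alone.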
After the two lines are determined to be the candidate pair, the coordinates of the two points where the candidates cross an arbitrary image row are selected, and the width between the candidates is calculated from those coordinates. If the width satisfies the preset condition, the left edge line is determined to be one of the two lines of a double yellow line; if it does not, the left edge line is determined not to be part of a double yellow line.
For example, the candidate pair can be drawn in the binarized image, and the coordinates (u1, v) and (u2, v) of the two points where the lines cross some row v are selected, as shown in FIG. 4. Since the lane lines are parallel, the true width is the same no matter which row is chosen. By the principle of perspective transformation, with xI = (u − u0)·dx and yI = (v − v0)·dy, where dx and dy are the pixel sizes and yc is the camera height, the image points can be mapped to road-plane lateral positions xc1 and xc2. Combining this mapping with the line equation ρ = x·cos θ + y·sin θ for each candidate line leads, through equations (one) to (five), to equation (six), whose result xc1 − xc2 is the width between the two candidate lines. (Equations (one) through (six), which carry out this derivation step by step, appear as images in the original publication.)
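Since the numbered equations are not reproduced in the text, the width formula can be reconstructed under standard flat-road pinhole-camera assumptions. This is an illustrative reconstruction consistent with the quantities defined above (xI, yI, u0, v0, dx, dy, yc, and ρ = x cos θ + y sin θ parameterized in pixel coordinates), not the patent's verbatim derivation:

```latex
% Pinhole projection of a road-plane point at lateral offset x_c and depth z_c,
% camera mounted at height y_c: x_I = f x_c / z_c and y_I = f y_c / z_c, hence
\[
x_c = \frac{y_c\, x_I}{y_I}, \qquad
x_I = (u - u_0)\,dx, \qquad
y_I = (v - v_0)\,dy .
\]
% Each candidate line satisfies \rho_i = u \cos\theta_i + v \sin\theta_i,
% so its column on image row v is
\[
u_i(v) = \frac{\rho_i - v \sin\theta_i}{\cos\theta_i}, \qquad i = 1, 2 .
\]
% Subtracting the two lateral positions on the same row v (u_0 cancels) gives
% the physical width between the candidate pair:
\[
x_{c1} - x_{c2}
  = \frac{y_c\, dx}{(v - v_0)\, dy}\,\bigl(u_1(v) - u_2(v)\bigr),
\]
% which is independent of the focal length f and, for parallel lane lines,
% of the chosen row v.
```

This matches the check described next: the computed xc1 − xc2 is compared against the preset interval.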
When the two candidates satisfy the preset condition, namely T3 < xc1 − xc2 < T4, they are determined to be a real double yellow line, and the left edge line is therefore one of its two lines. If the calculated width does not satisfy the preset condition, another straight line with a vote count close to that of the left edge line is searched for, and the double yellow line judgment is repeated. If no pair satisfies the double yellow line requirement after all candidate lines have been traversed, the left edge line is considered not to be part of a double yellow line, and the current lane marking may be a single yellow line.
Step 103: outputting the line of the double yellow line with the larger polar radius as the lane line.
When the left edge line is determined to be one of the two lines of a double yellow line, the inner line of the pair is selected for output. Because the polar radius ρ of the inner line of the double yellow line is larger than that of the outer line, the line with the larger ρ is selected and output as the lane line.
The final output is shown in FIG. 5, where the position marked by the left black line is the position of the inner line of the double yellow line.
Step 104: outputting the left edge line as the lane line.
If the left edge line is determined not to be part of a double yellow line, the current marking is not a double yellow line and may be a single yellow line, so the left edge line itself is output as the lane line.
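The overall decision of steps 102 through 104 can be sketched as follows. The function name and the threshold values T3 and T4 are illustrative assumptions; the width is assumed to be the physical width between the candidate pair computed in step 102.

```python
def select_lane_line(left_line, partner, width_m, T3=0.08, T4=0.35):
    """left_line, partner: (rho, theta) line parameters from the Hough
    transform (partner may be None if no candidate pair was found);
    width_m: computed physical width between the candidates in meters.
    Returns the (rho, theta) of the line to output as the lane line."""
    if partner is not None and width_m is not None and T3 < width_m < T4:
        # Confirmed double yellow line: the inner line has the larger
        # polar radius rho, so output that one (step 103).
        return max(left_line, partner, key=lambda line: line[0])
    # Not a double yellow line: output the left edge line itself (step 104).
    return left_line

inner = select_lane_line((100.0, 0.02), (112.0, 0.03), width_m=0.20)
single = select_lane_line((100.0, 0.02), None, width_m=None)
```

With a plausible pair and width, `inner` is the larger-ρ line of the pair; with no partner, the left edge line is passed through unchanged.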
Compared with the prior art, the invention exploits two characteristics of a double yellow line: its two lines have very similar features and a fixed spacing. The double yellow line can be detected by searching the vote counts and the (ρ, θ) positions of the straight lines; whether two lines form a double yellow line is decided by checking whether their actual width meets the double yellow line requirement; and finally the line with the larger ρ is selected as the inner line and output. The invention can thus quickly and accurately detect the inner line of a double yellow line and warn in time against the dangerous behavior of crossing it, greatly improving the driver's safety.
Based on the same concept, the present invention also provides an in-vehicle apparatus, as shown in fig. 6, including a memory 61, a processor 62, a communication interface 63, and a communication bus 64;
wherein, the memory 61, the processor 62 and the communication interface 63 communicate with each other through the communication bus 64;
the memory 61 is used for storing computer programs;
the processor 62 is configured to execute the computer program stored in the memory 61, and when the processor 62 executes the computer program, any step of the method for detecting a lane line according to the embodiment of the present invention is implemented.
The present invention also provides a computer-readable storage medium, in which a computer program is stored, and when the computer program is executed by a processor, the computer program implements any step of the method for detecting a lane line provided in the embodiment of the present invention.
All the embodiments in this specification are described in a related manner; the same and similar parts among the embodiments may be referred to each other, and each embodiment focuses on its differences from the others. In particular, for the embodiments of the computer device and the computer-readable storage medium, since they are substantially similar to the method embodiments, the description is relatively simple; for relevant points, reference may be made to the partial description of the method embodiments.
In summary, the left edge line can be obtained from the grayscale image of the road; if the left edge line is determined to be one of the two lines of a double yellow line, the line with the larger polar radius is selected from the pair and output as the lane line; otherwise, the left edge line is considered a single yellow line and is output directly as the lane line. Compared with the prior art, the invention can quickly and accurately detect the inner line from the characteristics of the double yellow line, prevent false departure warnings caused by jitter in the detection result, and help ensure driving safety.
The implementation process of the functions and actions of each unit in the above device is specifically described in the implementation process of the corresponding step in the above method, and is not described herein again.
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the scheme of the application. One of ordinary skill in the art can understand and implement it without inventive effort.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like made within the spirit and principle of the present invention should be included in the scope of the present invention.
Claims (7)
1. A method of detecting a lane line, the method comprising:
acquiring a left side edge line from a gray scale image of a road;
judging whether the left edge line is one of the two lines of a double yellow line;
if so, outputting the line of the double yellow line with the larger polar radius as a lane line;
if not, outputting the left edge line as a lane line.
2. The method of claim 1, wherein acquiring the left edge line from the grayscale image of the road comprises:
selecting a region of interest from the grayscale image of the road;
filtering and binarizing the region of interest to obtain a left edge region;
performing a Hough transform on the left edge region to detect the left edge line.
3. The method of claim 1, wherein determining whether the left edge line is a straight line in a double yellow line comprises:
taking the left edge line, together with the straight line in its neighborhood whose vote count is closest to that of the left edge line, as the candidate double yellow line pair;
selecting the coordinates of the two points where the candidate lines cross an arbitrary image row, and calculating the width between the candidate lines from those coordinates;
if the width satisfies a preset condition, determining that the left edge line is one of the two lines of a double yellow line;
if the width does not satisfy the preset condition, determining that the left edge line is not part of a double yellow line.
4. The method of claim 3, wherein taking the left edge line and the closest-voted neighboring straight line as the candidate double yellow line pair comprises:
obtaining the vote count of the left edge line;
acquiring, from the neighborhood, the target straight line whose vote count is closest to that of the left edge line;
selecting the target straight line and the left edge line as the candidate pair, where the difference between their polar radii is smaller than a first threshold and the difference between their polar angles is smaller than a second threshold.
5. The method of claim 3 or 4, wherein determining the neighborhood comprises:
calculating a Hough matrix containing the left edge line;
and determining a region adjacent to the position of the left edge line in the Hough matrix as the neighborhood.
6. An in-vehicle device, characterized in that the in-vehicle device comprises a memory, a processor, a communication interface and a communication bus;
the memory, the processor and the communication interface are communicated with each other through the communication bus;
the memory is used for storing a computer program;
the processor for executing the computer program stored on the memory, the processor implementing the method according to any of claims 1-5 when executing the computer program.
7. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811448712.4A CN111241894B (en) | 2018-11-28 | 2018-11-28 | Method for detecting lane line and vehicle-mounted equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111241894A true CN111241894A (en) | 2020-06-05 |
CN111241894B CN111241894B (en) | 2023-06-27 |
Family
ID=70875914
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811448712.4A Active CN111241894B (en) | 2018-11-28 | 2018-11-28 | Method for detecting lane line and vehicle-mounted equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111241894B (en) |
Citations (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1186199A (en) * | 1997-09-10 | 1999-03-30 | Yazaki Corp | Method and device for detecting traffic lane |
US20090174577A1 (en) * | 2007-11-29 | 2009-07-09 | Aisin Aw Co., Ltd. | Image recognition apparatuses, methods and programs |
CN102332209A (en) * | 2011-02-28 | 2012-01-25 | 王志清 | Automobile violation video monitoring method |
CN102521589A (en) * | 2011-11-18 | 2012-06-27 | 深圳市宝捷信科技有限公司 | Method and system for detecting lane marked lines |
CN103903019A (en) * | 2014-04-11 | 2014-07-02 | 北京工业大学 | Automatic generating method for multi-lane vehicle track space-time diagram |
CN104794855A (en) * | 2014-01-22 | 2015-07-22 | 径卫视觉科技(上海)有限公司 | Driver's attention comprehensive assessment system |
CN105005771A (en) * | 2015-07-13 | 2015-10-28 | 西安理工大学 | Method for detecting full line of lane based on optical flow point locus statistics |
CN105279979A (en) * | 2015-10-15 | 2016-01-27 | 华南理工大学 | Variable lane driving direction switching method based on vehicle position on tracking lane |
CN105426863A (en) * | 2015-11-30 | 2016-03-23 | 奇瑞汽车股份有限公司 | Method and device for detecting lane line |
CN105447892A (en) * | 2015-11-05 | 2016-03-30 | 奇瑞汽车股份有限公司 | Method and device for determining yaw angle of vehicle |
CN105460009A (en) * | 2015-11-30 | 2016-04-06 | 奇瑞汽车股份有限公司 | Automobile control method and device |
CN105718916A (en) * | 2016-01-27 | 2016-06-29 | 大连楼兰科技股份有限公司 | Lane line detection method based on Hough transform |
CN205451514U (en) * | 2016-01-27 | 2016-08-10 | 王德龙 | Car real -time road conditions over --horizon radar of navigation and network alarm system |
CN106504164A (en) * | 2016-10-19 | 2017-03-15 | 东南大学 | A kind of division methods of combination area of city and country's inferior grade road speeds control zone |
CN106529505A (en) * | 2016-12-05 | 2017-03-22 | 惠州华阳通用电子有限公司 | Image-vision-based lane line detection method |
US9639804B1 (en) * | 2016-03-22 | 2017-05-02 | Smartdrive Systems, Inc. | System and method to determine responsiveness of a driver of a vehicle to feedback regarding driving behaviors |
US20170316333A1 (en) * | 2015-11-04 | 2017-11-02 | Zoox, Inc. | Automated extraction of semantic information to enhance incremental mapping modifications for robotic vehicles |
CN107437071A (en) * | 2017-07-18 | 2017-12-05 | 杭州岱石科技有限公司 | A kind of robot autonomous method for inspecting based on double amber lines detection |
US20180004214A1 (en) * | 2017-09-15 | 2018-01-04 | GM Global Technology Operations LLC | Vehicle remote assistance mode |
CN107643086A (en) * | 2016-07-22 | 2018-01-30 | 北京四维图新科技股份有限公司 | A kind of vehicle positioning method, apparatus and system |
CN108090425A (en) * | 2017-12-06 | 2018-05-29 | 海信集团有限公司 | A kind of method for detecting lane lines, device and terminal |
CN108694830A (en) * | 2018-05-15 | 2018-10-23 | 何冠男 | The radar recognition methods of road traffic mark |
- 2018-11-28: application CN201811448712.4A filed (patent CN111241894B, status Active)
Also Published As
Publication number | Publication date |
---|---|
CN111241894B (en) | 2023-06-27 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |