CN110770741A - Lane line identification method and device and vehicle - Google Patents


Info

Publication number
CN110770741A
CN110770741A
Authority
CN
China
Prior art keywords
line segment
line
segment
segments
score
Prior art date
Legal status
Granted
Application number
CN201880039256.XA
Other languages
Chinese (zh)
Other versions
CN110770741B (en)
Inventor
崔健 (Cui Jian)
Current Assignee
Shenzhen Zhuoyu Technology Co ltd
Original Assignee
Shenzhen Dajiang Innovations Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Dajiang Innovations Technology Co Ltd filed Critical Shenzhen Dajiang Innovations Technology Co Ltd
Publication of CN110770741A publication Critical patent/CN110770741A/en
Application granted granted Critical
Publication of CN110770741B publication Critical patent/CN110770741B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 - Scenes; Scene-specific elements
    • G06V 20/50 - Context or environment of the image
    • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V 20/588 - Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 - Pattern recognition
    • G06F 18/20 - Analysing
    • G06F 18/24 - Classification techniques
    • G06F 18/241 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Multimedia (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Analysis (AREA)
  • Traffic Control Systems (AREA)

Abstract

Provided are a lane line identification method and device, and a vehicle. The lane line identification method comprises the following steps: identifying all line segments in a road image; for each line segment, determining the associated line segments of the line segment from the other line segments, and calculating the contribution degree of each associated line segment to the line segment, wherein the contribution degree is used for representing the degree to which the associated line segment influences the line segment being a lane line; calculating the score of each line segment according to the length of the line segment and the contribution degrees of its associated line segments, wherein the score is used for representing the possibility that the line segment is a lane line; and determining the line segment with the highest score, together with the associated line segments of that line segment, as the lane lines. The method jointly considers the length of each line segment in the road image and the contribution degrees of its associated line segments when judging the possibility that the line segment is a lane line, and through combinatorial optimization makes the detected lane line result conform to the actual lane lines as closely as possible, thereby improving the robustness of lane line detection.

Description

Lane line identification method and device and vehicle
Technical Field
The invention relates to the technical field of image processing, in particular to a lane line identification method and device and a vehicle.
Background
In the related art, lane lines in a road image are detected by feature extraction, straight line detection, or curve detection. In an actual scene, however, many things other than lane lines are also line-shaped, such as guard rails on both sides of the road, linear road surface markers (such as arrow characters), and the edges of vehicles or pedestrians on the road. These are recognized as line-like shapes by the above algorithms and hence misidentified as lane lines, so the error rate of lane lines recognized by such algorithms is clearly high.
Disclosure of Invention
The invention provides a lane line identification method and device and a vehicle.
Specifically, the invention is realized by the following technical scheme:
according to a first aspect of the present invention, there is provided a lane line identification method, the method comprising:
identifying all line segments in the road image;
for each line segment, determining the associated line segment of the line segment from the other line segments, and calculating the contribution degree of the associated line segment of the line segment to the line segment, wherein the contribution degree is used for representing the degree to which the associated line segment influences the line segment being a lane line;
calculating the score of each line segment according to the length of the line segment and the contribution degree of the associated line segment of the line segment to the line segment, wherein the score is used for representing the possibility that the line segment is a lane line;
and determining the line segment with the highest score and the associated line segment of the line segment with the highest score as the lane line.
According to a second aspect of the present invention, there is provided a lane line identification device comprising:
storage means for storing program instructions;
a processor calling program instructions stored in the storage device, the program instructions when executed operable to:
identifying all line segments in the road image;
for each line segment, determining the associated line segment of the line segment from the other line segments, and calculating the contribution degree of the associated line segment of the line segment to the line segment, wherein the contribution degree is used for representing the degree to which the associated line segment influences the line segment being a lane line;
calculating the score of each line segment according to the length of the line segment and the contribution degree of the associated line segment of the line segment to the line segment, wherein the score is used for representing the possibility that the line segment is a lane line;
and determining the line segment with the highest score and the associated line segment of the line segment with the highest score as the lane line.
According to a third aspect of the present invention, there is provided a vehicle comprising:
a vehicle body;
a shooting device fixed on the vehicle body; and
the processor is electrically connected with the shooting device;
the shooting device is used for shooting road images in front of the vehicle and sending the road images to the processor, and the processor is used for:
identifying all line segments in the road image;
for each line segment, determining the associated line segment of the line segment from the other line segments, and calculating the contribution degree of the associated line segment of the line segment to the line segment, wherein the contribution degree is used for representing the degree to which the associated line segment influences the line segment being a lane line;
calculating the score of each line segment according to the length of the line segment and the contribution degree of the associated line segment of the line segment to the line segment, wherein the score is used for representing the possibility that the line segment is a lane line;
and determining the line segment with the highest score and the associated line segment of the line segment with the highest score as the lane line.
According to a fourth aspect of the present invention, there is provided a computer readable storage medium having stored thereon a computer program which when executed by a processor performs the steps of:
identifying all line segments in the road image;
for each line segment, determining the associated line segment of the line segment from the other line segments, and calculating the contribution degree of the associated line segment of the line segment to the line segment, wherein the contribution degree is used for representing the degree to which the associated line segment influences the line segment being a lane line;
calculating the score of each line segment according to the length of the line segment and the contribution degree of the associated line segment of the line segment to the line segment, wherein the score is used for representing the possibility that the line segment is a lane line;
and determining the line segment with the highest score and the associated line segment of the line segment with the highest score as the lane line.
According to the technical solution provided by the embodiments of the invention, the length of each line segment in the road image and the contribution degrees of the line segment's associated line segments are jointly considered to judge the possibility that the line segment is a lane line, and through combinatorial optimization the detected lane line result is made to conform to the actual lane lines as closely as possible, thereby improving the robustness of lane line detection.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. It is apparent that the drawings in the following description are only some embodiments of the present invention, and that other drawings can be obtained by those skilled in the art based on these drawings without inventive effort.
Fig. 1 is a flowchart of a method of lane line identification according to an embodiment of the present invention;
fig. 2 is a flowchart of a specific implementation manner of the lane line identification method shown in fig. 1 according to an embodiment of the present invention;
fig. 3 is a flowchart of another specific implementation manner of the lane line identification method shown in fig. 1 according to an embodiment of the present invention;
fig. 4 is a flowchart of a specific implementation manner of the lane line identification method shown in fig. 3 according to an embodiment of the present invention;
fig. 5 is a block diagram of a lane line recognition apparatus according to an embodiment of the present invention;
fig. 6 is a block diagram of a vehicle according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The following describes a lane line recognition method and apparatus, and a vehicle in detail, with reference to the drawings. The features of the following examples and embodiments may be combined with each other without conflict.
Example one
Fig. 1 is a flowchart of a method for identifying a lane line according to an embodiment of the present invention. Referring to fig. 1, the lane line identification method may include the steps of:
step S101: identifying all line segments in the road image;
in practice, the lane line may include a straight line and a curved line, so that both the straight line segment and the curved line segment in the road image are considered as the suspected lane line. By executing step S101, the present embodiment may identify all straight line segments and/or curved line segments in the road image, that is, the line segments of the present embodiment may include straight line segments and/or curved line segments.
In this embodiment, all the line segment regions in the road image are segmented first, and then all the line segments are identified based on a line segment detection algorithm. The manner of segmenting the line segment regions in the road image can be selected as needed. For example, in some examples, all line segment regions in the road image are segmented based on a CNN (Convolutional Neural Network). Optionally, all line segment regions in the road image are segmented based on CNN semantic segmentation. Optionally, a deep learning algorithm is used to train on a large number of road image samples to obtain a lane line model, and the current road image is input into the lane line model to obtain all line segment regions (including straight line segment regions and/or curved line segment regions) in the current road image.
In other examples, all the line segment regions in the road image are segmented based on an edge detection algorithm. Specifically, the edges of all line segments in the road image are detected based on an edge detection algorithm, so that all line segment areas in the road image are segmented.
The line segment detection algorithm of this embodiment may be the Hough transform, or another line segment detection algorithm; the specific type of algorithm can be selected as needed. Based on the line segment detection algorithm, parameter information of all line segments in the road image can be identified, for example the lengths of all line segments and the positional relationships between them (such as the included angles between line segments and/or the distances between line segments).
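As an illustrative sketch only (OpenCV-based; the Canny and Hough thresholds below are assumptions chosen for illustration, not values from this disclosure), the detection step could look like:

```python
import cv2
import numpy as np

def detect_segments(road_image):
    """Detect candidate line segments and their basic parameters."""
    gray = cv2.cvtColor(road_image, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)  # edge map feeding the detector
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=50,
                            minLineLength=40, maxLineGap=10)
    segments = [] if lines is None else [tuple(l[0]) for l in lines]
    # Length and orientation of each (x1, y1, x2, y2) segment.
    lengths = [np.hypot(x2 - x1, y2 - y1) for x1, y1, x2, y2 in segments]
    angles = [np.arctan2(y2 - y1, x2 - x1) for x1, y1, x2, y2 in segments]
    return segments, lengths, angles
```

The pairwise included angles and distances used in the later steps follow directly from these per-segment angles and endpoints.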
The lane line recognition method of this embodiment can be applied to vehicles, in particular unmanned vehicles. Road images in front of the vehicle can be captured by a camera on the vehicle, and such road images are generally front views. In a front view, lane markers such as road surface arrows and lane lines are distorted, and the distorted shape depends on the position of the vehicle; line segments farther away from the vehicle are harder to recognize, and the same lane marker has poor consistency across the front view, making accurate recognition difficult. To improve the accuracy of lane line identification, this embodiment performs image rectification on the road image before identifying all line segments in it. The image rectification method can be selected as needed; in one embodiment, the road image is projected to a corresponding top view based on the inverse perspective transformation. Projecting the road image to a top view restores road surface markers such as lane lines and arrows to their real dimensions and properties, making the road surface markers in the top view easier to identify. In addition, the positions of road surface pixel points in the top view correspond directly to real positions, so the positional relationship between a pixel point and the vehicle can be obtained directly from the top view, which meets the requirements of basic ADAS functions and automatic driving functions.
Specifically, based on the inverse perspective transformation, the projecting the road image to the corresponding top view may include the following steps:
(1) calibrating internal parameters of a shooting device and external parameters of the shooting device to the ground;
wherein the intrinsic matrix of the shooting device is

$$M = \begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix}$$

where $f_x$ and $f_y$ characterize the focal length of the shooting device, and $c_x$ and $c_y$ characterize the position at which the optical axis of the lens passes through the imaging sensor. The calibration of the intrinsic parameters may use existing calibration algorithms and is not described in detail here.
The external parameters from the shooting device to the ground comprise a rotation matrix R and a translation vector T, which are respectively the rotation and translation of the shooting device relative to the object plane; in this embodiment the object plane is the plane in which the lane lines lie. T can be converted from the height of the shooting device above the ground. The calibration of R is realized indirectly by calibrating the pitch angle of the shooting device relative to the ground (the ground at the moment the shooting device captures the current road image), the roll angle of the shooting device relative to the ground, and the yaw angle of the shooting device relative to straight ahead of the vehicle. The pitch, roll and yaw are the rotation angles of the shooting device about its own coordinate axes x, y and z, denoted $\psi$, $\theta$ and $\varphi$ respectively. From these three angles, the rotation matrices corresponding to the three axes, $R_x(\psi)$, $R_y(\theta)$ and $R_z(\varphi)$, can be calculated, and R is then calculated from these three matrices.

In the present embodiment,

$$R_x(\psi)=\begin{bmatrix}1&0&0\\0&\cos\psi&-\sin\psi\\0&\sin\psi&\cos\psi\end{bmatrix},\qquad R_y(\theta)=\begin{bmatrix}\cos\theta&0&\sin\theta\\0&1&0\\-\sin\theta&0&\cos\theta\end{bmatrix},\qquad R_z(\varphi)=\begin{bmatrix}\cos\varphi&-\sin\varphi&0\\\sin\varphi&\cos\varphi&0\\0&0&1\end{bmatrix}$$
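As a minimal NumPy sketch of this composition (the z-y-x multiplication order below is an assumption for illustration; the text only states that R is calculated from the three per-axis matrices, without giving the order):

```python
import numpy as np

def rotation_from_angles(psi, theta, phi):
    """Compose R from the per-axis rotations R_x(psi), R_y(theta), R_z(phi)."""
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(psi), -np.sin(psi)],
                   [0, np.sin(psi),  np.cos(psi)]])
    Ry = np.array([[ np.cos(theta), 0, np.sin(theta)],
                   [0, 1, 0],
                   [-np.sin(theta), 0, np.cos(theta)]])
    Rz = np.array([[np.cos(phi), -np.sin(phi), 0],
                   [np.sin(phi),  np.cos(phi), 0],
                   [0, 0, 1]])
    return Rz @ Ry @ Rx  # assumed z-y-x composition order
```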
(2) Calculating a projection matrix H that maps pixel points in the road image to the top view;

Mapping a point in the object plane coordinate system onto the image coordinate system can be expressed as:

$$s\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = M\,[\,r_1\ \ r_2\ \ r_3\ \ t\,]\begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix} \qquad (1)$$

where u and v are the pixel coordinates in the road image coordinate system;

s is a normalization coefficient;

M is the intrinsic matrix of the shooting device;

$[\,r_1\ \ r_2\ \ r_3\ \ t\,]$ is the external parameter, i.e. the positional relationship, from the shooting device to the object plane;

$r_1$, $r_2$ and $r_3$ are 3×1 column vectors, and together they form the rotation matrix R;

t is a 3×1 column vector representing the translation of the shooting device relative to the object plane;

X and Y denote coordinates in the object plane.

Assuming that the object lies in one plane and Z is zero, equation (1) can be expressed as:

$$s\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = M\,[\,r_1\ \ r_2\ \ t\,]\begin{bmatrix} X \\ Y \\ 1 \end{bmatrix} \qquad (2)$$

$$H = sM\,[\,r_1\ \ r_2\ \ t\,] \qquad (3)$$

(3) Projecting the road image to the top view according to the projection matrix H.

Substituting equation (3) into equation (2) yields:

$$\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = H\begin{bmatrix} X \\ Y \\ 1 \end{bmatrix} \qquad (4)$$

If the object plane coordinates X, Y are taken as ground coordinates, then multiplying both sides of equation (4) by $H^{-1}$ turns it into the mapping from the road image to the ground:

$$\begin{bmatrix} X \\ Y \\ 1 \end{bmatrix} = H^{-1}\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} \qquad (5)$$

Substituting the coordinates of each pixel point of the road image into equation (5) yields the coordinates of that pixel point in the top view.
Projecting points of the object plane to the top view with the inverse perspective transformation is only accurate for points that actually lie on the object plane; the projection of points not on the object plane contains errors. For example, the rail of a guardrail is very close to a roadside lane line from a true top-down viewpoint, but projecting points on the guardrail to the top view with the inverse perspective transformation is inaccurate, which can cause the guardrail to be falsely detected as a lane line. In this embodiment, the road image is projected to the top view using the inverse perspective transformation, and lane line detection is then performed on the top view, where real lane lines and false lane lines can be more easily distinguished by the lane line detection method under the prior conditions described below.
In this embodiment, all line segments in the top view are identified; specifically, all line segment areas in the top view are segmented first, and then all line segments are identified based on a line segment detection algorithm.
Step S102: for each line segment, determining the associated line segment of the line segment from the other line segments, and calculating the contribution degree of the associated line segment of the line segment to the line segment, wherein the contribution degree is used for representing the degree to which the associated line segment influences the line segment being a lane line;
specifically, referring to fig. 2, the associated line segment of the line segment is determined from the other line segments according to the position relationship between the line segment and the other line segments and the preset lane line prior condition. Optionally, the position relationship between the line segment and the other line segments includes an included angle between the line segment and the other line segments, or/and a distance between the line segment and the other line segments, and of course, the position relationship between the line segment and the other line segments may also be set according to other position relationships between the real lane lines. In one embodiment, the position relationship between the line segment and the other line segments includes an included angle between the line segment and the other line segments. In another embodiment, the positional relationship between the line segment and the other line segments includes a spacing between the line segment and the other line segments. In another embodiment, the positional relationship between the line segment and other line segments includes the included angle between the line segment and other line segments and the distance between the line segment and other line segments.
This embodiment is further described taking the case where the positional relationship between the line segment and the other line segments includes both the included angle between them and the distance between them.
Lane lines on an actual road are as parallel as possible, and the spacing between adjacent lane lines is roughly in the range of 2.5 meters to 4.2 meters. Due to the shooting angle and other factors, however, a small included angle generally exists between lane lines in the road image. The smaller the included angle between two line segments in the road image, the closer the two line segments are to being parallel on the actual road, and the more likely the two line segments are lane lines. The preset lane line prior condition of this embodiment therefore includes that the included angle between lane lines is within a preset included angle range, where the size of the preset range can be set according to the condition that lane lines are as parallel as possible.
Further, the preset lane line prior condition also includes that the distance between lane lines is an integral multiple of a preset distance. The preset distance is usually a value, or a range of values, obtained by proportionally scaling down the real spacing between adjacent lane lines; once the preset distance is determined, the closer the ratio of the distance between two line segments in the road image to the preset distance is to an integer, the more likely the two line segments are lane lines. In this embodiment, the preset distance can be set according to the condition that the spacing between adjacent lane lines is roughly in the range of 2.5 meters to 4.2 meters.
The identified line segments are preliminarily screened through the preset lane line prior conditions, filtering out line segments that are obviously not lane lines, such as arrows, vehicle edges, or guardrails whose distance from a normal lane line does not match the real spacing, and line segments that are not parallel to normal lane lines.
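As a sketch of this preliminary screening (the helper callables `included_angle` and `distance`, and all threshold values, are hypothetical; the disclosure states the two prior conditions only qualitatively):

```python
import numpy as np

MAX_ANGLE = np.deg2rad(5.0)   # assumed preset included angle range
PRESET_DISTANCE = 3.5         # assumed preset distance, within 2.5-4.2 m
INTEGER_TOLERANCE = 0.2       # how close distance/PRESET_DISTANCE must be to an integer

def find_associated_segments(i, segments, included_angle, distance):
    """Return indices of segments satisfying the lane line prior conditions
    with respect to segment i."""
    associated = []
    for j in range(len(segments)):
        if j == i:
            continue
        ratio = distance(segments[i], segments[j]) / PRESET_DISTANCE
        near_integer_multiple = (round(ratio) >= 1 and
                                 abs(ratio - round(ratio)) <= INTEGER_TOLERANCE)
        if included_angle(segments[i], segments[j]) <= MAX_ANGLE and near_integer_multiple:
            associated.append(j)
    return associated
```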
In this step, if the current line segment is a straight line segment, the determined associated line segment of the current line segment is a straight line segment. If the current line segment is a curved segment, the determined associated line segment of the current line segment is a curved segment.
During processing, for a straight line segment, the associated straight line segments can be determined directly from the other straight line segments according to the positional relationships between them and the preset lane line prior condition. A curved segment is usually divided into a plurality of pieces, each of which approximates a straight line segment, and the associated curved segments of each piece are then determined; the determination process is similar to that for straight line segments and is not repeated here. A straight line segment may likewise be divided into a plurality of pieces when processed.
Referring to fig. 3, after determining the associated line segment of each line segment, for each line segment, the contribution degree of the associated line segment of the line segment to the line segment is calculated according to the position relationship between the line segment and the associated line segment of the line segment, the length of the associated line segment of the line segment, and the length of the line segment. The position relation between the line segment and the associated line segment of the line segment comprises an included angle between the line segment and the associated line segment of the line segment and a distance between the line segment and the associated line segment.
For each line segment, the specific steps of calculating the contribution degree of the associated line segment of the line segment to the line segment according to the position relationship between the line segment and the associated line segment of the line segment, the length of the associated line segment of the line segment, and the length of the line segment can be seen in fig. 4:
step S401: calculating the score of each associated line segment according to the length of the associated line segment and the contribution degree of other associated line segments of the line segment to the associated line segment;
step S402: and aiming at each line segment, calculating the contribution degree of each associated line segment to the line segment according to the score of each associated line segment, the included angle between each associated line segment and the line segment, the ratio of the distance between each associated line segment and the line segment to a preset distance and the length of the line segment.
In a possible implementation manner, when step S401 is executed, a left associated line segment located on the left side of the line segment and a right associated line segment located on the right side of the line segment are determined according to the positional relationship between the line segment and the associated line segment of the line segment. Then, for each left associated line segment of each line segment, the score of the left associated line segment is calculated according to the length of the left associated line segment and the contribution degree of the other left associated line segments of the left associated line segment to it; and for each right associated line segment of each line segment, the score of the right associated line segment is calculated according to the length of the right associated line segment and the contribution degree of the other right associated line segments of the right associated line segment to it. The score of the left associated line segment and the score of the right associated line segment can be calculated simultaneously or sequentially.
It should be noted that, in the embodiments of the present invention, the other left associated line segments of a left associated line segment are the associated line segments located to its left, and the other right associated line segments of a right associated line segment are the associated line segments located to its right. For example, suppose the associated line segments identified in the road image are line segment 1, line segment 2, line segment 3, line segment 4 and line segment 5, arranged from left to right. Taking line segment 3 as an example, its left associated line segments include line segment 1 and line segment 2, and its right associated line segments include line segment 4 and line segment 5; line segment 1 has no left associated line segment, the left associated line segment of line segment 2 includes line segment 1, the right associated line segment of line segment 4 includes line segment 5, and line segment 5 has no right associated line segment. When step S402 is executed, for each line segment, the left contribution degree of each left associated line segment to the line segment is calculated according to the score of each left associated line segment, the included angle between each left associated line segment and the line segment, the ratio of the distance between each left associated line segment and the line segment to the preset distance, and the length of the line segment; and the right contribution degree of each right associated line segment to the line segment is calculated according to the score of each right associated line segment, the included angle between each right associated line segment and the line segment, the ratio of the distance between each right associated line segment and the line segment to the preset distance, and the length of the line segment.
Specifically, for the ith line segment, the left contribution degree $CS^L_{i,j}$ of the jth left associated line segment to the ith line segment is calculated by formula (6), and the right contribution degree $CS^R_{i,j}$ of the jth right associated line segment to the ith line segment is calculated by formula (7). (Formulas (6) and (7) are reproduced only as images in the source; they are functions of the quantities defined below.)
wherein i and j are positive integers, i ∈ (1, n), j ∈ (1, n), and n is the number of line segments;

$L_i$ is the length of the ith line segment;

$k_1$ is a first preset coefficient, and $k_1 > 0$; in this example $k_1$ can be set as needed, and the larger $k_1$ is set, the more strongly the contribution degree is influenced by α;

α is the included angle between the ith line segment and the jth left or jth right associated line segment;

δ is the ratio of the distance between the ith line segment and the jth left or jth right associated line segment to the preset distance;

$k_2$ is a second preset coefficient, and $0 < k_2 < 1$; in this example $k_2$ can be set as needed, and the smaller $k_2$ is set, the more strongly the contribution degree is influenced by δ;

$S^L_j$ is the score of the jth left associated line segment;

$S^R_j$ is the score of the jth right associated line segment.
It should be noted that when there is no associated line segment to the left of the jth left associated line segment, $S^L_j = L_j$; when there is no associated line segment to the right of the jth right associated line segment, $S^R_j = L_j$.
It will be appreciated that the calculation of $CS^L_{i,j}$ is not limited to formula (6) listed in this embodiment, and the calculation of $CS^R_{i,j}$ is likewise not limited to formula (7).
Step S103: calculating the score of each line segment according to the length of each line segment and the contribution degree of the associated line segment of the line segment to the line segment, wherein the score is used for representing the possibility that the line segment is a lane line;
In this embodiment, the score of the ith line segment is $S_i$, calculated as follows:

$$S_i = L_i + CS_{i,N} \qquad (8)$$

In formula (8), $L_i$ is the length of the ith line segment, and $CS_{i,N}$ is the contribution degree of the Nth associated line segment of the ith line segment to the ith line segment.

It is understood that the calculation of the score $S_i$ of the ith line segment is not limited to equation (8) above.
To simplify the calculation process, step S103 is specifically executed as follows: for each line segment, the score of the line segment is calculated according to the length of the line segment, the maximum value $\max_j CS^L_{i,j}$ of the left contribution degrees of the line segment's left associated line segments to the line segment, and the maximum value $\max_j CS^R_{i,j}$ of the right contribution degrees of the line segment's right associated line segments to the line segment. The two maxima can be weighted with different coefficients before the score is computed, with the specific weights determined by the application scenario; in this scheme both weights are 1, i.e. the two contributions are treated as equivalent. Optionally, the score of each line segment is the sum of the length of the line segment, the maximum left contribution degree of its left associated line segments to the line segment, and the maximum right contribution degree of its right associated line segments to the line segment.

Equation (8) above is then simplified as:

$$S_i = L_i + \max_j CS^L_{i,j} + \max_j CS^R_{i,j} \qquad (9)$$
The calculation of $CS^L_{i,j}$ and $CS^R_{i,j}$ is described in step S102 above and is not repeated here. The score of the ith line segment can then be calculated through formula (9).
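Since formulas (6) and (7) are available only as images, the sketch below keeps the contribution function pluggable and shows only the recursive structure of steps S401-S402 and formula (9). The memoized inner `max` over a neighbor's own associated segments is an assumption generalizing the single-neighbor worked example that follows:

```python
from functools import lru_cache

def lane_line_scores(lengths, left_assoc, right_assoc, contribution):
    """Compute scores S_i per formula (9).

    lengths      : lengths[i] is L_i
    left_assoc   : left_assoc[i] lists indices of segment i's left associated segments
    right_assoc  : right_assoc[i] lists indices of segment i's right associated segments
    contribution : contribution(neighbor_score, i, j) stands in for formulas
                   (6)/(7), which are only published as images
    """

    @lru_cache(maxsize=None)
    def left_score(j):
        # A segment with nothing to its left scores just its length.
        if not left_assoc[j]:
            return lengths[j]
        return lengths[j] + max(contribution(left_score(k), j, k)
                                for k in left_assoc[j])

    @lru_cache(maxsize=None)
    def right_score(j):
        if not right_assoc[j]:
            return lengths[j]
        return lengths[j] + max(contribution(right_score(k), j, k)
                                for k in right_assoc[j])

    scores = []
    for i in range(len(lengths)):
        best_left = max((contribution(left_score(j), i, j)
                         for j in left_assoc[i]), default=0.0)
        best_right = max((contribution(right_score(j), i, j)
                          for j in right_assoc[i]), default=0.0)
        scores.append(lengths[i] + best_left + best_right)  # formula (9)
    return scores
```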
For example, suppose that for line segment 3 identified in the road image, the associated line segments are determined to be line segment 1, line segment 2, line segment 4 and line segment 5, where line segment 1 and line segment 2 are located on the left side of line segment 3, and line segment 4 and line segment 5 are located on the right side of line segment 3. When calculating the score of line segment 3, the contribution degrees of line segment 1 and line segment 2 to line segment 3 and the contribution degrees of line segment 4 and line segment 5 to line segment 3 all need to be calculated.

When calculating the contribution degree of line segment 1 to line segment 3, since no associated line segment exists on the left side of line segment 1, the score of line segment 1 is determined only according to the length of line segment 1; the contribution degree of line segment 1 to line segment 3 is then determined according to the score of line segment 1, the positional relationship between line segment 1 and line segment 3 (the included angle between them, and the ratio of the distance between them to the preset distance), and the length of line segment 3.

When calculating the contribution degree of line segment 2 to line segment 3, the score of line segment 2 is first determined according to the length of line segment 2 and the contribution degree of line segment 1 to line segment 2; the contribution degree of line segment 2 to line segment 3 is then determined according to the score of line segment 2, the positional relationship between line segment 2 and line segment 3 (the included angle between them, and the ratio of the distance between them to the preset distance), and the length of line segment 3.

When calculating the contribution degree of line segment 4 to line segment 3, the score of line segment 4 is first determined according to the length of line segment 4 and the contribution degree of line segment 5 to line segment 4; the contribution degree of line segment 4 to line segment 3 is then determined according to the score of line segment 4, the positional relationship between line segment 4 and line segment 3 (the included angle between them, and the ratio of the distance between them to the preset distance), and the length of line segment 3.

When calculating the contribution degree of line segment 5 to line segment 3, since no associated line segment exists on the right side of line segment 5, the score of line segment 5 is determined only according to the length of line segment 5; the contribution degree of line segment 5 to line segment 3 is then determined according to the score of line segment 5, the positional relationship between line segment 5 and line segment 3 (the included angle between them, and the ratio of the distance between them to the preset distance), and the length of line segment 3.
After the contribution degrees of line segment 1, line segment 2, line segment 4 and line segment 5 to line segment 3 are calculated, the score of line segment 3 is calculated according to the length of line segment 3, the larger of the contribution degrees of line segment 1 and line segment 2 to line segment 3, and the larger of the contribution degrees of line segment 4 and line segment 5 to line segment 3.
Step S104: and determining the line segment with the highest score and the associated line segment of the line segment with the highest score as the lane line.
After step S103 is executed, the scores of all line segments in the road image are obtained, and the line segment with the highest score among all line segments is then determined:

$$i^* = \underset{i \in (1,n)}{\arg\max}\; S_i \qquad (10)$$

Through formula (10), the line segment with the highest score and the associated line segments of the line segment with the highest score are determined as the lane lines.
The lane line identification method jointly considers the length of each line segment in the road image and the contribution degrees of the line segment's associated line segments to judge the possibility that the line segment is a lane line, and through combinatorial optimization makes the detected lane line result conform to the actual lane lines as closely as possible, thereby improving the robustness of lane line detection.
Based on the lengths of the line segments, the included angles between them, and the ratios of the distances between them to the preset distance, line segments that would otherwise be falsely detected as lane lines, such as arrows, sidewalks, road surface characters, guardrails and vehicle edges, can be filtered out, and an optimal combination is selected as the lane lines, reducing the false detection rate of lane line detection.
Example two
Fig. 5 is a block diagram of a lane line detection apparatus according to a second embodiment of the present invention. Referring to fig. 5, the lane line detecting apparatus includes: a storage device 110 and a first processor 120.
The storage device 110 is used for storing program instructions. The first processor 120 calls the program instructions stored in the storage device 110, and when the program instructions are executed, is configured to: identify all line segments in the road image; for each line segment, determine the associated line segment of the line segment from the other line segments, and calculate the contribution degree of the associated line segment to the line segment, wherein the contribution degree is used for representing the degree to which the associated line segment influences the line segment being a lane line; calculate the score of each line segment according to the length of the line segment and the contribution degree of its associated line segment, wherein the score is used for representing the possibility that the line segment is a lane line; and determine the line segment with the highest score and the associated line segment of the line segment with the highest score as the lane line.
The first processor 120 may implement the corresponding method shown in the embodiments of fig. 1 to fig. 4 of the present invention, and the lane line identification device of this embodiment may be described with reference to the lane line identification method of the first embodiment, which is not described herein again.
In this embodiment, the storage device 110 may include a volatile memory, such as a random-access memory (RAM); the storage device 110 may also include a non-volatile memory, such as a flash memory, a hard disk drive (HDD) or a solid-state drive (SSD); the storage device 110 may also comprise a combination of the kinds of memory described above.
The first processor 120 may be a Central Processing Unit (CPU). The processor may further include a hardware chip. The hardware chip may be an application-specific integrated circuit (ASIC), a Programmable Logic Device (PLD), or a combination thereof. The PLD may be a Complex Programmable Logic Device (CPLD), a field-programmable gate array (FPGA), a General Array Logic (GAL), or any combination thereof.
EXAMPLE III
Fig. 6 is a block diagram of a vehicle according to a third embodiment of the present invention. Referring to fig. 6, the vehicle may include a vehicle body (not shown), a camera 210 fixed to the vehicle body, and a second processor 220, wherein the camera 210 is electrically connected to the second processor 220.
The photographing device 210 of the present embodiment is used to photograph an image of the road in front of the vehicle and send the image to the second processor 220. The second processor 220 is used for identifying all line segments in the road image; determining the associated line segment of each line segment from other line segments, and calculating the contribution degree of the associated line segment of the line segment to the line segment, wherein the contribution degree is used for representing the influence degree of the associated line segment on the line segment as a lane line; calculating the score of each line segment according to the length of each line segment and the contribution degree of the associated line segment of the line segment to the line segment, wherein the score is used for representing the possibility that the line segment is a lane line; and determining the line segment with the highest score and the associated line segment of the line segment with the highest score as the lane line.
The second processor 220 may implement the corresponding method shown in the embodiments of fig. 1 to fig. 4 of the present invention; the vehicle of this embodiment may be described with reference to the lane line identification method of the first embodiment, which is not described herein again.
The second processor 220 of the present embodiment may be a vehicle master controller, or may be another controller provided in the vehicle. Further, taking the second processor 220 as a master controller as an example, after the second processor 220 of the present embodiment determines the lane line by the above-mentioned method, the vehicle can be controlled to run according to the determined lane line, so as to meet the requirements of the basic ADAS function and the automatic driving function.
The camera 210 may be a camera or an image sensor, and the type of the camera 210 may be selected according to the requirement.
Example four
Furthermore, an embodiment of the present invention also provides a computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, implements the steps of the lane line identification method of the above-described embodiment.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
The above disclosure is intended to be illustrative of only some embodiments of the invention, and is not intended to limit the scope of the invention.

Claims (58)

1. A lane line identification method, the method comprising:
identifying all line segments in the road image;
for each line segment, determining the associated line segment of the line segment from the other line segments, and calculating the contribution degree of the associated line segment of the line segment to the line segment, wherein the contribution degree is used for representing the degree to which the associated line segment influences the line segment being a lane line;
calculating the score of each line segment according to the length of the line segment and the contribution degree of the associated line segment of the line segment to the line segment, wherein the score is used for representing the possibility that the line segment is a lane line;
and determining the line segment with the highest score and the associated line segment of the line segment with the highest score as the lane line.
2. The method of claim 1, wherein the line segment comprises:
straight and/or curved line segments.
3. The method of claim 1, wherein for each line segment, determining an associated line segment for the line segment from the other line segments comprises:
and determining the associated line segment of the line segment from other line segments according to the position relation between the line segment and other line segments and the preset prior condition of the lane line.
4. The method of claim 3, wherein the positional relationship between the line segment and other line segments comprises:
the included angle between the line segment and other line segments, or/and the spacing between the line segment and other line segments.
5. The method of claim 4, wherein the preset lane line prior condition comprises:
the included angle between the lane lines is within a preset included angle range, or/and the distance between the lane lines is an integral multiple of a preset distance.
6. The method of claim 1, wherein calculating, for each line segment, the degree of contribution of the line segment associated with the line segment to the line segment comprises:
and calculating the contribution degree of the associated line segment of the line segment to the line segment according to the position relation between the line segment and the associated line segment of the line segment, the length of the associated line segment of the line segment and the length of the line segment.
7. The method of claim 6, wherein the positional relationship between the line segment and other line segments comprises:
the included angle between the line segment and other line segments, or/and the spacing between the line segment and other line segments.
8. The method of claim 6, wherein for each line segment, calculating the contribution degree of the line segment associated with the line segment to the line segment according to the position relationship between the line segment and the line segment associated with the line segment, the length of the line segment associated with the line segment, and the length of the line segment comprises:
calculating the score of each associated line segment according to the length of the associated line segment and the contribution degree of other associated line segments of the line segment to the associated line segment;
and for each line segment, calculating the contribution degree of each associated line segment to the line segment according to the score of each associated line segment, the positional relationship between each associated line segment and the line segment, and the length of the line segment.
9. The method of claim 8, wherein calculating, for each associated segment of each segment, a score of the associated segment according to the length of the associated segment and the degree of contribution of other associated segments of the segment to the associated segment comprises:
determining a left associated line segment positioned on the left side of the line segment and a right associated line segment positioned on the right side of the line segment according to the position relation between the line segment and the associated line segment of the line segment;
calculating the score of each left associated line segment according to the length of the left associated line segment and the contribution degree of other left associated line segments of the left associated line segment to the left associated line segment;
and calculating the score of each right associated line segment according to the length of the right associated line segment and the contribution degree of other right associated line segments of the right associated line segment to the right associated line segment.
10. The method of claim 9, wherein calculating, for each line segment, the contribution degree of each associated line segment to the line segment according to the score of each associated line segment, the position relationship between each associated line segment and the line segment, and the length of the line segment comprises:
for each line segment, calculating the left contribution degree of each left associated line segment to the line segment according to the score of each left associated line segment, the positional relationship between each left associated line segment and the line segment, and the length of the line segment;
and for each line segment, calculating the right contribution degree of each right associated line segment to the line segment according to the score of each right associated line segment, the positional relationship between each right associated line segment and the line segment, and the length of the line segment.
11. The method of claim 10, wherein for the ith line segment, the left contribution degree $CS^L_{i,j}$ of the jth left associated line segment to the ith line segment is calculated by formula (6), and the right contribution degree $CS^R_{i,j}$ of the jth right associated line segment to the ith line segment is calculated by formula (7), wherein:

i and j are positive integers;

$L_i$ is the length of the ith line segment;

$k_1$ is a first preset coefficient, and $k_1 > 0$;

α is the included angle between the ith line segment and the jth left or jth right associated line segment;

δ is the ratio of the distance between the ith line segment and the jth left or jth right associated line segment to the preset distance;

$k_2$ is a second preset coefficient, and $0 < k_2 < 1$;

$S^L_j$ is the score of the jth left associated line segment;

$S^R_j$ is the score of the jth right associated line segment.

(Formulas (6) and (7) appear only as images in the source.)
12. The method of claim 10, wherein calculating the score of each line segment according to the length of the line segment and the degree of contribution of the line segment associated with the line segment to the line segment comprises:
and calculating the score of each line segment according to the length of the line segment, the maximum value of the left contribution degree of the left associated line segment of the line segment to the line segment and the maximum value of the right contribution degree of the right associated line segment of the line segment to the line segment.
13. The method of claim 12, wherein the score of each line segment is the sum of the length of the line segment, the maximum left contribution degree of the line segment's left associated line segments to the line segment, and the maximum right contribution degree of the line segment's right associated line segments to the line segment.
14. The method of claim 1, wherein identifying all line segments in the road image comprises:
segmenting all line segment areas in the road image; and
identifying all the line segments based on a line segment detection algorithm.
15. The method of claim 14, wherein the segmenting all line segment regions in the road image comprises:
segmenting all line segment areas in the road image based on a CNN; or,
segmenting all line segment areas in the road image based on an edge detection algorithm.
16. The method of claim 14, wherein the identifying all line segments based on the line segment detection algorithm comprises:
the lengths of all line segments, the included angles between all line segments, and/or the spacings between all line segments are identified based on a line segment detection algorithm.
17. The method of claim 14 or 16, wherein the line segment detection algorithm is a Hough transform algorithm.
18. The method of claim 1 or 14, wherein before identifying all line segments in the road image, further comprising:
and carrying out image correction on the road image.
19. The method of claim 18, wherein the image rectifying the road image comprises:
based on the inverse perspective transformation, the road image is projected to a corresponding top view.
20. A lane line identification apparatus, comprising:
storage means for storing program instructions;
a processor calling program instructions stored in the storage device, the program instructions when executed operable to:
identifying all line segments in the road image;
for each line segment, determining the associated line segment of the line segment from the other line segments, and calculating the contribution degree of the associated line segment of the line segment to the line segment, wherein the contribution degree is used for representing the degree to which the associated line segment influences the line segment being a lane line;
calculating the score of each line segment according to the length of the line segment and the contribution degree of the associated line segment of the line segment to the line segment, wherein the score is used for representing the possibility that the line segment is a lane line;
and determining the line segment with the highest score and the associated line segment of the line segment with the highest score as the lane line.
21. The apparatus of claim 20, wherein the line segment comprises:
straight and/or curved line segments.
22. The apparatus of claim 20, wherein the processor is specifically configured to:
and determining the associated line segment of the line segment from other line segments according to the position relation between the line segment and other line segments and the preset prior condition of the lane line.
23. The apparatus of claim 22, wherein the positional relationship between the line segment and the other line segments comprises:
the included angle between the line segment and the other line segments, and/or the spacing between the line segment and the other line segments.
24. The apparatus of claim 23, wherein the preset lane line prior condition comprises:
the included angle between lane lines being within a preset included angle range, and/or the spacing between lane lines being an integer multiple of a preset spacing.
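A sketch of the prior-condition test of claim 24; the angle bound, the preset spacing (a nominal lane width), and the tolerance on the integer-multiple check are all assumed values:

```python
import math

def satisfies_prior(angle_between: float, spacing: float,
                    max_angle: float = math.radians(5.0),
                    preset_spacing: float = 3.5,
                    tol: float = 0.3) -> bool:
    # Claim 24: included angle within a preset range, and/or spacing an
    # integer multiple of a preset spacing (here: both, with tolerance tol).
    angle_ok = abs(angle_between) <= max_angle
    multiple = spacing / preset_spacing
    spacing_ok = round(multiple) >= 1 and abs(multiple - round(multiple)) <= tol
    return angle_ok and spacing_ok
```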
25. The apparatus of claim 20, wherein the processor is specifically configured to:
calculate the contribution degree of the associated line segment of the line segment to the line segment according to the positional relationship between the line segment and the associated line segment, the length of the associated line segment, and the length of the line segment.
26. The apparatus of claim 25, wherein the positional relationship between the line segment and the associated line segment of the line segment comprises:
the included angle between the line segment and the associated line segment, and/or the spacing between the line segment and the associated line segment.
27. The apparatus of claim 25, wherein the processor is specifically configured to:
calculate the score of each associated line segment according to the length of the associated line segment and the contribution degrees of the other associated line segments of the line segment to that associated line segment; and
for each line segment, calculate the contribution degree of each associated line segment to the line segment according to the score of the associated line segment, the positional relationship between the associated line segment and the line segment, and the length of the line segment.
28. The apparatus of claim 27, wherein the processor is specifically configured to:
determine, according to the positional relationship between the line segment and its associated line segments, left associated line segments located on the left side of the line segment and right associated line segments located on the right side of the line segment;
calculate the score of each left associated line segment according to the length of the left associated line segment and the contribution degrees of the other left associated line segments to that left associated line segment; and
calculate the score of each right associated line segment according to the length of the right associated line segment and the contribution degrees of the other right associated line segments to that right associated line segment.
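The left/right partition of claim 28 can be sketched as below; judging sidedness by the sign of a 2D cross product is one reasonable reading, not a construction fixed by the claims, and the `p1`/`p2` endpoint attributes are assumed:

```python
def split_by_side(seg, associated):
    # Partition seg's associated segments into left and right (claim 28),
    # using the cross product of seg's direction with the vector to each
    # associated segment's midpoint to decide the side.
    (x1, y1), (x2, y2) = seg.p1, seg.p2
    dx, dy = x2 - x1, y2 - y1
    left, right = [], []
    for other in associated:
        mx = (other.p1[0] + other.p2[0]) / 2.0 - x1
        my = (other.p1[1] + other.p2[1]) / 2.0 - y1
        (left if dx * my - dy * mx > 0 else right).append(other)
    return left, right
```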
29. The apparatus of claim 28, wherein the processor is specifically configured to:
for each line segment, calculate the left contribution degree of each left associated line segment to the line segment according to the score of the left associated line segment, the positional relationship between the left associated line segment and the line segment, and the length of the line segment; and
for each line segment, calculate the right contribution degree of each right associated line segment to the line segment according to the score of the right associated line segment, the positional relationship between the right associated line segment and the line segment, and the length of the line segment.
30. The apparatus of claim 29, wherein, for the ith line segment, the left contribution degree of the jth left associated line segment to the ith line segment, C_left(i, j), is calculated by a first formula, and the right contribution degree of the jth right associated line segment to the ith line segment, C_right(i, j), is calculated by a second formula (both formulas appear in the source only as images and are not recoverable from the text), wherein:
i and j are positive integers;
L_i is the length of the ith line segment;
k_1 is a first predetermined coefficient, and k_1 > 0;
α is the included angle between the ith line segment and the jth left or right associated line segment;
δ is the ratio of the spacing between the ith line segment and the jth left or right associated line segment to the preset spacing;
k_2 is a second predetermined coefficient, and 0 < k_2 < 1;
S_left(j) is the score of the jth left associated line segment; and
S_right(j) is the score of the jth right associated line segment.
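Since the formula images did not survive extraction, the exact functional form of claim 30 is unknown. The sketch below only wires the named quantities together in one plausible way — the associated segment's score damped by the angle mismatch (via k_1 > 0) and by the spacing ratio's distance from the nearest integer (via 0 < k_2 < 1) — and must not be read as the patented formula:

```python
import math

def left_or_right_contribution(score_j: float, alpha: float, delta: float,
                               k1: float = 1.0, k2: float = 0.5) -> float:
    # Assumed shape, NOT the patent's formula: penalize the associated
    # segment's score S_j by the angle mismatch alpha and by how far the
    # spacing ratio delta lies from the nearest integer multiple.
    angle_penalty = math.exp(-k1 * abs(alpha))          # k1 > 0
    spacing_penalty = k2 ** abs(delta - round(delta))   # 0 < k2 < 1
    return score_j * angle_penalty * spacing_penalty
```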
31. The apparatus of claim 29, wherein the processor is specifically configured to:
calculate the score of each line segment according to the length of the line segment, the maximum value of the left contribution degrees of the left associated line segments of the line segment to the line segment, and the maximum value of the right contribution degrees of the right associated line segments of the line segment to the line segment.
32. The apparatus of claim 31, wherein the score of each line segment is the sum of the length of the line segment, the maximum value of the left contribution degrees of the left associated line segments of the line segment to the line segment, and the maximum value of the right contribution degrees of the right associated line segments of the line segment to the line segment.
33. The apparatus of claim 20, wherein the processor is specifically configured to:
segment all line segment regions in the road image; and
identify all line segments based on a line segment detection algorithm.
34. The apparatus of claim 33, wherein the processor is specifically configured to:
segment all line segment regions in the road image based on a CNN; or
segment all line segment regions in the road image based on an edge detection algorithm.
35. The apparatus of claim 33, wherein the processor is specifically configured to:
identify, based on the line segment detection algorithm, the lengths of all line segments, the included angles between the line segments, and/or the spacings between the line segments.
36. The apparatus of claim 33 or 35, wherein the line segment detection algorithm is a Hough transform algorithm.
37. The apparatus of claim 20 or 33, wherein, before identifying all line segments in the road image, the processor is further configured to:
perform image rectification on the road image.
38. The apparatus of claim 37, wherein the processor is specifically configured to:
project the road image to a corresponding top view based on inverse perspective transformation.
39. A vehicle, comprising:
a vehicle body;
a camera fixed to the vehicle body; and
a processor electrically connected with the camera;
wherein the camera is configured to capture a road image in front of the vehicle and send the road image to the processor, and the processor is configured to:
identify all line segments in the road image;
for each line segment, determine associated line segments of the line segment from the other line segments, and calculate the contribution degree of each associated line segment to the line segment, the contribution degree representing the degree of influence of the associated line segment on the line segment being a lane line;
calculate the score of each line segment according to the length of the line segment and the contribution degrees of the associated line segments of the line segment to the line segment, the score representing the likelihood that the line segment is a lane line; and
determine the line segment with the highest score and the associated line segments of that line segment as the lane lines.
40. The vehicle of claim 39, wherein the line segments comprise:
straight line segments and/or curved line segments.
41. The vehicle of claim 39, wherein the processor is specifically configured to:
determine the associated line segments of the line segment from the other line segments according to the positional relationship between the line segment and the other line segments and a preset lane line prior condition.
42. The vehicle of claim 41, wherein the positional relationship between the line segment and the other line segments comprises:
the included angle between the line segment and the other line segments, and/or the spacing between the line segment and the other line segments.
43. The vehicle of claim 42, wherein the preset lane line prior condition comprises:
the included angle between lane lines being within a preset included angle range, and/or the spacing between lane lines being an integer multiple of a preset spacing.
44. The vehicle of claim 39, wherein the processor is specifically configured to:
calculate the contribution degree of the associated line segment of the line segment to the line segment according to the positional relationship between the line segment and the associated line segment, the length of the associated line segment, and the length of the line segment.
45. The vehicle of claim 44, wherein the positional relationship between the line segment and the associated line segment of the line segment comprises:
the included angle between the line segment and the associated line segment, and/or the spacing between the line segment and the associated line segment.
46. The vehicle of claim 44, wherein the processor is specifically configured to:
calculate the score of each associated line segment according to the length of the associated line segment and the contribution degrees of the other associated line segments of the line segment to that associated line segment; and
for each line segment, calculate the contribution degree of each associated line segment to the line segment according to the score of the associated line segment, the positional relationship between the associated line segment and the line segment, and the length of the line segment.
47. The vehicle of claim 46, wherein the processor is specifically configured to:
determine, according to the positional relationship between the line segment and its associated line segments, left associated line segments located on the left side of the line segment and right associated line segments located on the right side of the line segment;
calculate the score of each left associated line segment according to the length of the left associated line segment and the contribution degrees of the other left associated line segments to that left associated line segment; and
calculate the score of each right associated line segment according to the length of the right associated line segment and the contribution degrees of the other right associated line segments to that right associated line segment.
48. The vehicle of claim 47, wherein the processor is specifically configured to:
for each line segment, calculate the left contribution degree of each left associated line segment to the line segment according to the score of the left associated line segment, the positional relationship between the left associated line segment and the line segment, and the length of the line segment; and
for each line segment, calculate the right contribution degree of each right associated line segment to the line segment according to the score of the right associated line segment, the positional relationship between the right associated line segment and the line segment, and the length of the line segment.
49. The vehicle of claim 48, wherein, for the ith line segment, the left contribution degree of the jth left associated line segment to the ith line segment, C_left(i, j), is calculated by a first formula, and the right contribution degree of the jth right associated line segment to the ith line segment, C_right(i, j), is calculated by a second formula (both formulas appear in the source only as images and are not recoverable from the text), wherein:
i and j are positive integers;
L_i is the length of the ith line segment;
k_1 is a first predetermined coefficient, and k_1 > 0;
α is the included angle between the ith line segment and the jth left or right associated line segment;
δ is the ratio of the spacing between the ith line segment and the jth left or right associated line segment to the preset spacing;
k_2 is a second predetermined coefficient, and 0 < k_2 < 1;
S_left(j) is the score of the jth left associated line segment; and
S_right(j) is the score of the jth right associated line segment.
50. The vehicle of claim 48, wherein the processor is specifically configured to:
calculate the score of each line segment according to the length of the line segment, the maximum value of the left contribution degrees of the left associated line segments of the line segment to the line segment, and the maximum value of the right contribution degrees of the right associated line segments of the line segment to the line segment.
51. The vehicle of claim 50, wherein the score of each line segment is the sum of the length of the line segment, the maximum value of the left contribution degrees of the left associated line segments of the line segment to the line segment, and the maximum value of the right contribution degrees of the right associated line segments of the line segment to the line segment.
52. The vehicle of claim 39, wherein the processor is specifically configured to:
segment all line segment regions in the road image; and
identify all line segments based on a line segment detection algorithm.
53. The vehicle of claim 52, wherein the processor is specifically configured to:
segment all line segment regions in the road image based on a CNN; or
segment all line segment regions in the road image based on an edge detection algorithm.
54. The vehicle of claim 52, wherein the processor is specifically configured to:
identify, based on the line segment detection algorithm, the lengths of all line segments, the included angles between the line segments, and/or the spacings between the line segments.
55. The vehicle of claim 52 or 54, wherein the line segment detection algorithm is a Hough transform algorithm.
56. The vehicle of claim 39 or 52, wherein, before identifying all line segments in the road image, the processor is further configured to:
perform image rectification on the road image.
57. The vehicle of claim 56, wherein the processor is specifically configured to:
project the road image to a corresponding top view based on inverse perspective transformation.
58. A computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the lane line identification method according to any one of claims 1 to 19.
CN201880039256.XA 2018-10-31 2018-10-31 Lane line identification method and device and vehicle Active CN110770741B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/112894 WO2020087322A1 (en) 2018-10-31 2018-10-31 Lane line recognition method and device, and vehicle

Publications (2)

Publication Number Publication Date
CN110770741A true CN110770741A (en) 2020-02-07
CN110770741B CN110770741B (en) 2024-05-03

Family

ID=69328785

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880039256.XA Active CN110770741B (en) 2018-10-31 2018-10-31 Lane line identification method and device and vehicle

Country Status (2)

Country Link
CN (1) CN110770741B (en)
WO (1) WO2020087322A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2022126341A (en) * 2021-02-18 2022-08-30 本田技研工業株式会社 Vehicle control device, vehicle control method and program

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008091565A1 (en) * 2007-01-23 2008-07-31 Valeo Schalter & Sensoren Gmbh Method and system for universal lane boundary detection
CN102663356B (en) * 2012-03-28 2015-04-08 柳州博实唯汽车科技有限公司 Method for extraction and deviation warning of lane line
CN103440785B (en) * 2013-08-08 2015-08-05 华南师范大学 One is traffic lane offset warning method fast
CN104063877B (en) * 2014-07-16 2017-05-24 中电海康集团有限公司 Hybrid judgment identification method for candidate lane lines
CN104700072B (en) * 2015-02-06 2018-01-19 中国科学院合肥物质科学研究院 Recognition methods based on lane line historical frames

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015194397A (en) * 2014-03-31 2015-11-05 株式会社デンソーアイティーラボラトリ Vehicle location detection device, vehicle location detection method, vehicle location detection computer program and vehicle location detection system
CN103940434A (en) * 2014-04-01 2014-07-23 西安交通大学 Real-time lane line detecting system based on monocular vision and inertial navigation unit
CN105718870A (en) * 2016-01-15 2016-06-29 武汉光庭科技有限公司 Road marking line extracting method based on forward camera head in automatic driving
CN107229908A (en) * 2017-05-16 2017-10-03 浙江理工大学 A kind of method for detecting lane lines

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111311675A (en) * 2020-02-11 2020-06-19 腾讯科技(深圳)有限公司 Vehicle positioning method, device, equipment and storage medium
CN111311675B (en) * 2020-02-11 2022-09-16 腾讯科技(深圳)有限公司 Vehicle positioning method, device, equipment and storage medium
CN112498342A (en) * 2020-11-26 2021-03-16 潍柴动力股份有限公司 Pedestrian collision prediction method and system
CN112347983A (en) * 2020-11-27 2021-02-09 腾讯科技(深圳)有限公司 Lane line detection processing method, lane line detection processing device, computer equipment and storage medium
CN112347983B (en) * 2020-11-27 2021-12-14 腾讯科技(深圳)有限公司 Lane line detection processing method, lane line detection processing device, computer equipment and storage medium

Also Published As

Publication number Publication date
CN110770741B (en) 2024-05-03
WO2020087322A1 (en) 2020-05-07

Similar Documents

Publication Publication Date Title
CN110770741B (en) Lane line identification method and device and vehicle
JP6347827B2 (en) Method, apparatus and device for detecting lane boundaries
Zhou et al. A novel lane detection based on geometrical model and gabor filter
CN107895375B (en) Complex road route extraction method based on visual multi-features
CN110490936B (en) Calibration method, device and equipment of vehicle camera and readable storage medium
JP5385105B2 (en) Image search method and system
US8098933B2 (en) Method and apparatus for partitioning an object from an image
CN110088766B (en) Lane line recognition method, lane line recognition device, and nonvolatile storage medium
Youjin et al. A robust lane detection method based on vanishing point estimation
CN108416306B (en) Continuous obstacle detection method, device, equipment and storage medium
US10235579B2 (en) Vanishing point correction apparatus and method
CN112348902A (en) Method, device and system for calibrating installation deviation angle of road end camera
JP5310027B2 (en) Lane recognition device and lane recognition method
KR101461108B1 (en) Recognition device, vehicle model recognition apparatus and method
CN110991264A (en) Front vehicle detection method and device
WO2019167238A1 (en) Image processing device and image processing method
CN117078717A (en) Road vehicle track extraction method based on unmanned plane monocular camera
CN113256701B (en) Distance acquisition method, device, equipment and readable storage medium
CN110809767B (en) Advanced driver assistance system and method
CN112184827B (en) Method and device for calibrating multiple cameras
CN111126109B (en) Lane line identification method and device and electronic equipment
CN115507815A (en) Target ranging method and device and vehicle
KR102629639B1 (en) Apparatus and method for determining position of dual camera for vehicle
JP6266340B2 (en) Lane identification device and lane identification method
CN115131273A (en) Information processing method, ranging method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20240515

Address after: Building 3, Xunmei Science and Technology Plaza, No. 8 Keyuan Road, Science and Technology Park Community, Yuehai Street, Nanshan District, Shenzhen City, Guangdong Province, 518057, 1634

Patentee after: Shenzhen Zhuoyu Technology Co.,Ltd.

Country or region after: China

Address before: 518057 Shenzhen Nanshan High-tech Zone, Shenzhen, Guangdong Province, 6/F, Shenzhen Industry, Education and Research Building, Hong Kong University of Science and Technology, No. 9 Yuexingdao, South District, Nanshan District, Shenzhen City, Guangdong Province

Patentee before: SZ DJI TECHNOLOGY Co.,Ltd.

Country or region before: China
