CN111563463A - Method and device for identifying road lane lines, electronic equipment and storage medium - Google Patents

Method and device for identifying road lane lines, electronic equipment and storage medium

Info

Publication number
CN111563463A
Authority
CN
China
Prior art keywords
line
lane
road surface
lane line
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010393718.7A
Other languages
Chinese (zh)
Inventor
周康明
张宪法
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Eye Control Technology Co Ltd
Original Assignee
Shanghai Eye Control Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Eye Control Technology Co Ltd filed Critical Shanghai Eye Control Technology Co Ltd
Priority to CN202010393718.7A priority Critical patent/CN111563463A/en
Publication of CN111563463A publication Critical patent/CN111563463A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/26 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G06V10/273 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion; removing elements interfering with the pattern to be recognised
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/30 Noise filtering
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/01 Detecting movement of traffic to be counted or controlled
    • G08G1/0104 Measuring and analyzing of parameters relative to traffic conditions

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a method and a device for identifying a road surface lane line, an electronic device and a storage medium. A target road surface image is obtained and subjected to image segmentation and extraction to obtain a predicted lane line and a reference object; the predicted lane line is corrected by means of the reference object to obtain an actual lane line, and the actual lane line is determined as the target road surface lane line. Because the reference object bears a definite reference relationship to the predicted lane line, correcting the predicted lane line according to that relationship effectively adds recognition features for the lane line, so the image recognition result of the lane line can be corrected and the accuracy of road surface lane line identification improved. This solves the problem of low accuracy in judging illegal vehicle driving caused by misrecognized road surface lane lines, and improves both the accuracy and the timeliness of such judgments.

Description

Method and device for identifying road lane lines, electronic equipment and storage medium
Technical Field
The present invention relates to the field of image recognition technologies, and in particular, to a method and an apparatus for recognizing a lane line on a road surface, an electronic device, and a storage medium.
Background
In the field of artificial-intelligence traffic, the various kinds of information collected at a traffic intersection are essential for analyzing the state of the intersection. Traffic-police violation auditors judge whether a vehicle should be penalized for running a red light according to whether it crossed the stop line while the light was red, judge whether it changed lanes illegally or pressed a line according to the positions of the solid road markings and the vehicle, and judge whether it drove in the wrong direction according to the road surface guide lines and the vehicle's direction of travel. Road surface lane lines constrain the direction in which vehicles may travel and are therefore particularly important for recognizing whether a vehicle is driving illegally.
In the prior art, an image acquisition device such as a camera installed at a traffic intersection is generally used to acquire and recognize road surface images, and whether a vehicle exhibits traffic violations such as line pressing, illegal lane changing or wrong-way driving is judged from the image recognition result.
However, when a road surface image of a traffic intersection is recognized in the prior art, lighting, dirt on the road surface and similar factors cause recognition errors in the road surface lane lines, so the accuracy of judging illegal driving of vehicles is low, and repeated recognition makes the timeliness poor.
Disclosure of Invention
The invention provides a method and a device for identifying a road surface lane line, an electronic device and a storage medium, which are used to solve the problems of low accuracy of vehicle illegal-driving judgment caused by recognition errors of the road surface lane lines and of poor timeliness caused by repeated recognition.
According to a first aspect of the embodiments of the present disclosure, the present invention provides a method for identifying a lane line on a road surface, the method comprising:
acquiring a target pavement image;
carrying out image segmentation and extraction on the target pavement image to obtain a predicted lane line and a reference object;
correcting the predicted lane line by using the reference object to obtain an actual lane line;
and determining the actual lane line as the target road lane line.
Optionally, the image segmentation and extraction of the target road surface image to obtain a predicted lane line and a reference object includes:
acquiring a road surface feature recognition model trained to be convergent;
and carrying out segmentation and extraction on the target road surface image through the road surface feature recognition model so as to obtain a predicted lane line and a reference object.
Optionally, the reference comprises at least one road guide line and a road stop line; before the obtaining of the road surface feature recognition model trained to converge, the method further includes:
obtaining a training sample of a road surface feature recognition model, wherein the training sample is a training road surface image, and the training road surface image comprises marked lane lines, road surface guide lines and road surface stop lines;
and training an initial road surface feature recognition model by using the training sample until a preset model convergence condition is met, so as to obtain the road surface feature recognition model trained to be converged.
Optionally, the modifying the predicted lane line by using the reference object to obtain an actual lane line includes:
determining a noise lane line according to the relation between the predicted lane line and the reference object;
and removing the noise lane line in the predicted lane line to obtain the actual lane line.
Optionally, the reference object includes at least one road surface guide line, the predicted lane line includes at least one lane boundary line, and the determining a noise lane line according to a relationship between the predicted lane line and the reference object includes:
and determining a first noise lane line according to the relation between the lane boundary line and the road surface guide line.
Optionally, the determining a first noisy lane line according to a relationship between the lane boundary line and the road surface guide line includes:
acquiring the length of a lane boundary line and the length of a road surface guide line;
and if the length of the lane boundary line is smaller than that of the shortest road guide line in the road surface guide lines, determining the lane boundary line as a first noise lane line.
Optionally, the determining a first noisy lane line according to a relationship between the lane boundary line and the road surface guide line includes:
determining the vertical distance between each lane boundary line and each road surface guide line;
and if the vertical distance between the lane boundary line and the road surface guide line is smaller than a first preset distance threshold, determining the lane boundary line as a first noise lane line.
Optionally, the determining the noise lane line according to the relationship between the predicted lane line and the reference object includes:
and determining a second noise lane line according to the relation between the lane boundary line and the road surface stop line.
Optionally, the determining a second noisy lane line according to the relationship between the lane boundary line and the road stop line includes:
determining a target road surface area and a non-target road surface area in the target road surface image according to the road surface stop line;
and if the lane boundary line is in the non-target road surface area, determining the lane boundary line as a second noise lane line.
Optionally, the determining the noise lane line according to the relationship between the predicted lane line and the reference object includes:
and determining a third noise lane line according to the relation between the lane boundary line and the vanishing point.
Optionally, the determining a third noise lane line according to the relationship between the lane boundary line and the vanishing point includes:
determining a central vanishing point according to the position of each vanishing point;
and if the vertical distance between the lane boundary line and the center vanishing point is greater than a second preset distance threshold, determining the lane boundary line as a third noise lane line.
Optionally, after determining the actual lane line as the target road lane line, the method further includes:
and saving the target road surface lane line as a structural file corresponding to the target road surface.
According to a second aspect of the disclosed embodiments, the present invention provides a road surface lane line recognition apparatus, including:
the image acquisition module is used for acquiring a target road surface image;
the image extraction module is used for carrying out image segmentation extraction on the target road surface image so as to obtain a predicted lane line and a reference object;
the lane line correction module is used for correcting the predicted lane line by using the reference object so as to obtain an actual lane line;
and the lane line determining module is used for determining the actual lane line as the target road lane line.
Optionally, the image extraction module is specifically configured to:
acquiring a road surface feature recognition model trained to be convergent;
and carrying out segmentation and extraction on the target road surface image through the road surface feature recognition model so as to obtain a predicted lane line and a reference object.
Optionally, the reference comprises at least one road guide line and a road stop line; the image extraction module is further configured to:
obtaining a training sample of a road surface feature recognition model, wherein the training sample is a training road surface image, and the training road surface image comprises marked lane lines, road surface guide lines and road surface stop lines;
and training an initial road surface feature recognition model by using the training sample until a preset model convergence condition is met, so as to obtain the road surface feature recognition model trained to be converged.
Optionally, the lane line correction module is specifically configured to:
determining a noise lane line according to the relation between the predicted lane line and the reference object;
and removing the noise lane line in the predicted lane line to obtain the actual lane line.
Optionally, the reference object includes at least one road surface guide line, the predicted lane line includes at least one lane boundary line, and the lane line correction module is specifically configured to, when determining a noise lane line according to a relationship between the predicted lane line and the reference object:
and determining a first noise lane line according to the relation between the lane boundary line and the road surface guide line.
Optionally, when the lane line correction module determines the first noise lane line according to the relationship between the lane boundary line and the road surface guide line, the lane line correction module is specifically configured to:
acquiring the length of a lane boundary line and the length of a road surface guide line;
and if the length of the lane boundary line is smaller than that of the shortest road guide line in the road surface guide lines, determining the lane boundary line as a first noise lane line.
Optionally, when the lane line correction module determines the first noise lane line according to the relationship between the lane boundary line and the road surface guide line, the lane line correction module is specifically configured to:
determining the vertical distance between each lane boundary line and each road surface guide line;
and if the vertical distance between the lane boundary line and the road surface guide line is smaller than a first preset distance threshold, determining the lane boundary line as a first noise lane line.
Optionally, the reference object includes a road stop line, the predicted lane line includes more than one lane boundary line, and the lane line correction module is specifically configured to, when determining a noise lane line according to a relationship between the predicted lane line and the reference object:
and determining a second noise lane line according to the relation between the lane boundary line and the road surface stop line.
Optionally, when the lane line correction module determines the second noise lane line according to the relationship between the lane boundary line and the road surface stop line, the lane line correction module is specifically configured to:
determining a target road surface area and a non-target road surface area in the target road surface image according to the road surface stop line;
and if the lane boundary line is in the non-target road surface area, determining the lane boundary line as a second noise lane line.
Optionally, the reference object includes a vanishing point, the predicted lane line includes more than two lane boundary lines, and the lane line correction module is specifically configured to, when determining a noise lane line according to a relationship between the predicted lane line and the reference object:
and determining a third noise lane line according to the relation between the lane boundary line and the vanishing point.
Optionally, when the lane line correction module determines a third noise lane line according to the relationship between the lane boundary line and the vanishing point, the lane line correction module is specifically configured to:
determining a central vanishing point according to the position of each vanishing point;
and if the vertical distance between the lane boundary line and the center vanishing point is greater than a second preset distance threshold, determining the lane boundary line as a third noise lane line.
Optionally, the lane line determination module is further configured to:
and saving the target road surface lane line as a structural file corresponding to the target road surface.
According to a third aspect of the embodiments of the present disclosure, the present invention provides an electronic apparatus including: a memory, a processor, and a computer program;
wherein the computer program is stored in the memory and configured to be executed by the processor to implement the method for identifying a lane line on a road surface according to any one of the first aspect of the embodiments of the present disclosure.
According to a fourth aspect of the embodiments of the present disclosure, the present disclosure provides a computer-readable storage medium in which computer-executable instructions are stored; when executed by a processor, the computer-executable instructions implement the method for identifying a lane line on a road surface according to any one of the first aspect of the embodiments of the present disclosure.
According to the method, the device, the electronic device and the storage medium for recognizing road surface lane lines provided above, a target road surface image is obtained and subjected to image segmentation and extraction to obtain a predicted lane line and a reference object; the predicted lane line is corrected by means of the reference object to obtain an actual lane line, and the actual lane line is determined as the target road surface lane line. Because the reference object bears a definite reference relationship to the predicted lane line, correcting the predicted lane line according to that relationship effectively adds recognition features for the lane line, so the image recognition result of the lane line can be corrected and the accuracy of road surface lane line identification improved. This solves the problem of low accuracy in judging illegal vehicle driving caused by misrecognized road surface lane lines, and improves both the accuracy and the timeliness of such judgments.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is an application scene diagram of a method for identifying a lane line on a road according to an embodiment of the present invention;
fig. 2 is a flowchart of a method for identifying a lane line on a road according to an embodiment of the present invention;
fig. 3 is a flowchart of a method for identifying a lane line on a road according to another embodiment of the present invention;
fig. 4 is a flowchart of a method for identifying a lane line on a road according to still another embodiment of the present invention;
FIG. 5 is a flowchart of step S304 in the embodiment shown in FIG. 4;
fig. 6 is a schematic diagram illustrating a dimensional relationship between a lane boundary line and a road surface guide line in a first implementation manner of step S304;
fig. 7 is a schematic view of a positional relationship between a lane boundary line and a road surface guide line in the second implementation manner of step S304;
FIG. 8 is a flowchart of step S305 in the embodiment shown in FIG. 4;
fig. 9 is a schematic view showing the relationship between the lane boundary line and the road surface stop line in step S305;
FIG. 10 is a flowchart of step S306 in the embodiment shown in FIG. 4;
fig. 11 is a schematic diagram illustrating the relationship between the lane boundary line and the vanishing point in step S306;
fig. 12 is a schematic structural view of a road lane line recognition apparatus according to an embodiment of the present invention;
fig. 13 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
With the foregoing drawings in mind, certain embodiments of the disclosure have been shown and described in more detail below. These drawings and written description are not intended to limit the scope of the disclosed concepts in any way, but rather to illustrate the concepts of the disclosure to those skilled in the art by reference to specific embodiments.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
The terms to which the present invention relates will be explained first:
vanishing point: the visual intersection points of the parallel lines are, according to the perspective principle, when a group of parallel lines extending to a far distance appears on the two-dimensional image, the group of parallel lines can intersect at a point at the far distance, the intersection point is a vanishing point, for example, a highway lane line extending to the far distance in the plane image, and as the highway lines are parallel to each other, the two parallel highway lines can intersect at the vanishing point along with the increase of the visual distance.
The following explains an application scenario of the embodiment of the present invention:
Fig. 1 is an application scene diagram of the method for identifying a road surface lane line according to an embodiment of the present invention. As shown in fig. 1, in this application scene the method is applied to an electronic device 1 that includes an image capturing device 11 disposed at a traffic intersection. The electronic device 1 captures a video image of the traffic intersection through the image capturing device 11 and recognizes the road surface lane lines with the method provided by the embodiment of the present invention. The lane line recognition result 2 can be displayed on the display device 12 of the electronic device 1 and is stored as a structural file, and the electronic device 1 judges illegal driving of vehicles at the traffic intersection according to the lane line recognition result 2.
In the prior art, the lane lines of traffic roads are recognized as follows: because traffic roads differ in layout and structure, generalized image recognition is usually performed after the road images are collected, and the recognition result is taken directly as the actual lane lines. In such recognition, the size and shape of an object are determined only from the pixel information in the image. However, according to prior knowledge the objects in the image bear certain reference relationships to one another, and the existing generalized recognition process ignores these relationships; effective information is therefore wasted, the recognition accuracy of road surface lane lines is low, and repeated recognition harms timeliness.
The following describes the technical solutions of the present invention and how to solve the above technical problems with specific embodiments. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments. Embodiments of the present invention will be described below with reference to the accompanying drawings.
Fig. 2 is a flowchart of a method for identifying a lane line on a road surface according to an embodiment of the present invention, and as shown in fig. 2, the method for identifying a lane line on a road surface according to the embodiment includes the following steps:
step S101, acquiring a target road surface image.
The target road surface image may be a two-dimensional image of the target road surface, such as a two-dimensional photograph or a video frame. The target road surface image may be acquired in various ways: for example, a two-dimensional image of the target road surface is captured directly by an image acquisition device, or an original or processed two-dimensional image transmitted by another electronic device is received; this is not limited here.
Step S102, image segmentation and extraction are carried out on the target road surface image so as to obtain a predicted lane line and a reference object.
Image segmentation and extraction means segmenting and extracting, i.e. recognizing, the different objects in an image. Specifically, performing image segmentation and extraction on the target road surface image means identifying and distinguishing the different objects in it, such as the various lane lines, traffic indication lines and zebra crossings, or objects such as traffic fences and traffic lights. The lane lines obtained by segmentation and extraction are the image recognition result, i.e. the predicted lane lines; the reference objects are the other objects apart from the lane lines, which bear a reference relationship to the size and position of the lane lines.
Step S103, correcting the predicted lane line by using the reference object to obtain an actual lane line.
Specifically, a certain reference relationship exists between the reference object and the predicted lane line: for example, a lane line is parallel to the straight guide line in the middle of the lane and is not parallel to the road surface stop line. Such prior knowledge is not used in the preceding image recognition, so correcting the predicted lane line according to this reference relationship effectively adds recognition features for the lane line and allows the image recognition result of the lane line to be corrected.
And step S104, determining the actual lane line as the target road lane line.
The corrected actual lane line is determined as the lane line of the target road surface, which improves the recognition accuracy of the lane line.
In this embodiment, the target road surface image is obtained and subjected to image segmentation and extraction to obtain a predicted lane line and a reference object; the predicted lane line is corrected by means of the reference object to obtain an actual lane line, and the actual lane line is determined as the target road surface lane line. Because the reference object bears a definite reference relationship to the predicted lane line, correcting the predicted lane line according to that relationship effectively adds recognition features for the lane line, so the image recognition result of the lane line can be corrected and the accuracy of road surface lane line identification improved. This solves the problem of low accuracy in judging illegal vehicle driving caused by misrecognized road surface lane lines, and improves both the accuracy and the timeliness of such judgments.
Fig. 3 is a flowchart of a method for identifying a lane line on a road surface according to another embodiment of the present invention, and as shown in fig. 3, the method for identifying a lane line on a road surface according to the embodiment of the present invention further refines steps S102 to S103 on the basis of the method for identifying a lane line on a road surface according to the embodiment shown in fig. 2, and then the method for identifying a lane line on a road surface according to the present embodiment includes the following steps:
step S201, a target road surface image is acquired.
Step S202, a road surface feature recognition model trained to be convergent is obtained.
Specifically, the road surface feature recognition model is a mathematical model for processing an image of a traffic road surface and recognizing specific object features therein, and for example, specific objects such as a vehicle, a lane line, a guide line, and a zebra crossing in the image of the traffic road surface can be segmented and extracted by the road surface feature recognition model.
Optionally, the road surface feature recognition model is implemented with the semantic segmentation network PSPNet.
There are various ways to implement the road surface feature recognition model, for example a model based on a neural network. It should be understood that the road surface feature recognition model is not limited to a neural network model; other machine learning algorithms can also implement it, and no specific limitation is made here.
Step S203, the target road surface image is segmented and extracted through the road surface feature recognition model to obtain a predicted lane line and a reference object.
In this step, because the road surface feature recognition model is a mathematical model built specifically to distinguish the features of particular objects on a traffic road surface, it can better recognize those object features, namely the predicted lane lines and the reference objects. This improves the recognition accuracy of the predicted lane lines and the reference objects, reduces the interference of other irrelevant objects on their recognition, and improves the accuracy and efficiency of the algorithm.
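Purely as an illustration of this segmentation-and-extraction step, the sketch below assumes a PyTorch segmentation model that outputs per-pixel class logits (for example a PSPNet-style network, as mentioned later) and assumes hypothetical class indices for lane lines, guide lines and stop lines; none of these details are fixed by this description.

    import numpy as np
    import cv2
    import torch

    # Hypothetical class indices in the model's output; the description does not fix a label map.
    LANE_LINE, GUIDE_LINE, STOP_LINE = 1, 2, 3

    def extract_contours(class_mask: np.ndarray, class_id: int):
        """Contours of all connected regions belonging to one class."""
        binary = (class_mask == class_id).astype(np.uint8)
        contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        return list(contours)

    def segment_road_image(model: torch.nn.Module, bgr_image: np.ndarray):
        """Run a trained segmentation model and split its result into
        predicted lane lines and reference objects (guide and stop lines)."""
        x = torch.from_numpy(bgr_image[:, :, ::-1].copy()).permute(2, 0, 1)
        x = x.float().unsqueeze(0) / 255.0            # 1 x 3 x H x W in [0, 1]
        with torch.no_grad():
            logits = model(x)                          # assumed shape: 1 x C x H x W
        class_mask = logits.argmax(dim=1)[0].cpu().numpy()
        return {
            "predicted_lane_lines": extract_contours(class_mask, LANE_LINE),
            "guide_lines": extract_contours(class_mask, GUIDE_LINE),
            "stop_lines": extract_contours(class_mask, STOP_LINE),
        }

The contour lists returned here correspond to the contour sets used in the noise checks described below.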
Optionally, the reference comprises at least one road guide line and a road stop line; before obtaining the road surface characteristic recognition model trained to be convergent, the method further comprises the following steps:
and obtaining a training sample of the road surface characteristic identification model, wherein the training sample is a training road surface image, and the training road surface image comprises marked lane lines, road surface guide lines and road surface stop lines.
And training the initial road surface feature recognition model by using the training sample until a preset model convergence condition is met, so as to obtain a road surface feature recognition model trained to be converged.
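As a rough sketch only, training to convergence could be implemented as below; the pixel-wise cross-entropy loss, the optimizer and the loss-change convergence rule are assumptions chosen for illustration and are not prescribed by this description.

    import torch
    import torch.nn as nn

    def train_to_convergence(model, loader, max_epochs=100, tol=1e-3, lr=1e-3):
        """Train an initial road surface feature recognition model until a
        simple preset convergence condition (small change in epoch loss) holds."""
        optimizer = torch.optim.Adam(model.parameters(), lr=lr)
        criterion = nn.CrossEntropyLoss()
        previous_loss = float("inf")
        for _ in range(max_epochs):
            epoch_loss = 0.0
            for images, masks in loader:       # masks label lane, guide and stop line pixels
                optimizer.zero_grad()
                loss = criterion(model(images), masks)
                loss.backward()
                optimizer.step()
                epoch_loss += loss.item()
            if abs(previous_loss - epoch_loss) < tol:   # convergence condition met
                break
            previous_loss = epoch_loss
        return model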
And step S204, determining a noise lane line according to the relation between the predicted lane line and the reference object.
Specifically, when the image is unclear because of dim light or dirt on the road surface, the collected pixel information often fails to reflect the real scene, so the road surface lane lines are frequently misrecognized. That is, the predicted lane lines include both correctly predicted actual lane lines and incorrectly predicted noise lane lines. Noise lane lines include shapes and markings mistaken for lane lines because of their similar appearance, for example guide lines and zebra crossings inside the target road surface, as well as guide lines, zebra crossings, road edges and other shapes or markings in the image that lie outside the target road surface.
According to the relationship between the reference object and the predicted lane lines, any predicted lane line that does not conform to prior knowledge can be determined as a noise lane line. For example, prior knowledge says that a lane line is longer than a guide line, so a lane line shorter than the guide line can be determined as a noise lane line. Different pieces of prior knowledge therefore yield different methods for determining noise lane lines, and the specific manner of determining them is not limited here.
In step S205, the noise lane line in the predicted lane line is removed to obtain the actual lane line.
The predicted lane lines contain noise lane lines, and judging illegal vehicle driving with these incorrectly predicted noise lane lines would lead to wrong judgments. Therefore, the noise lane lines are removed from the predicted lane lines to obtain the actual lane lines, and the subsequent illegal-driving judgment is performed with the actual lane lines, which effectively improves the accuracy of that judgment.
In step S206, the actual lane line is determined as the target road lane line.
In this embodiment, steps S201 and S206 are implemented in the same way as steps S101 and S104 in the embodiment shown in fig. 2 of the present invention, and are not described again.
Fig. 4 is a flowchart of a method for identifying a road lane line according to still another embodiment of the present invention, and as shown in fig. 4, the method for identifying a road lane line according to the embodiment of the present invention further refines step S204 on the basis of the method for identifying a road lane line according to the embodiment of fig. 3, and adds a step of saving a target road lane line as a structural file after step S206, so that the method for identifying a road lane line according to the embodiment of the present invention includes the following steps:
step S301, a target road surface image is acquired.
Step S302, a road surface feature recognition model trained to be convergent is obtained.
Step S303, the target road surface image is segmented and extracted through the road surface feature recognition model to obtain a predicted lane line and a reference object.
Optionally, the reference object comprises at least one road surface guide line and the predicted lane line comprises at least one lane boundary line.
Step S304, a first noise lane line is determined according to the relation between the lane boundary line and the road surface guide line.
Specifically, a certain reference relationship exists between lane boundary lines and road surface guide lines: generally, a road surface guide line is placed in the middle of the traffic lane formed by two adjacent lane boundary lines and has a specific size and position relationship with the adjacent boundary lines on both sides. According to this specific size and position relationship, the lane boundary lines among the predicted lane lines can be screened, and a lane boundary line that does not conform to it is a first noise lane line.
Optionally, step S304 includes two different implementation manners. Depending on the specific situation, the two implementations may be executed separately or both executed; when both are executed, their order is not specifically limited.
Optionally, as shown in fig. 5, in the step of the present embodiment, the first implementation manner and the second implementation manner in step S304 are performed in sequence. The first implementation manner includes two specific implementation steps S3041 and S3042:
in step S3041, the length of the lane boundary line and the length of the road surface guide line are acquired.
In step S3042, if the length of the lane boundary line is smaller than the length of the shortest one of the road surface guide lines, the lane boundary line is determined as the first noise lane line.
Optionally, the specific method for determining the first noisy lane line includes:
let the contour of the lane boundary line be L ═ L1,l2,l3,…,li,…lnThe outline of the road surface guide line is G ═ G }1,g2,g3,…,gj,…,gmS-S-stop line1,s2,s3,…,si,…,smIn which liIndicates the i-th strip lane boundary line profile, gjDenotes the profile of the j-th road guide line, siRepresenting the ith stop-line profile; let lRiIs represented byiThe smallest circumscribed rectangle of the outline of (a),
Figure BDA0002486871600000111
represents a rectangle lRiWidth and height of (d); let gRjDenotes gjThe smallest circumscribed rectangle of the outline of (a),
Figure BDA0002486871600000112
represents a rectangle gRjWidth and height of (d); let lFiIs represented byiThe line segment of (a) is fitted,
Figure BDA0002486871600000113
is represented byFiThe upper and lower vertices on image X,
Figure BDA0002486871600000114
is represented byFiSlope and bias.
Optionally, before comparing the length of the lane boundary line with the length of the road surface guide line, pre-screening may be performed according to the length of the lane boundary line, so as to remove the lane boundary line with an excessively small length, thereby improving the data processing efficiency.
Specifically, if the width lRi_w and the height lRi_h of the minimum circumscribed rectangle of li both exceed th1, li is retained; otherwise li is rejected, where th1 is measured in pixels; optionally, th1 = 50.
Optionally, before comparing the length of the lane boundary line with the length of the road surface guide line, the road surface guide line may be pre-screened according to the position information of the road surface guide line, so as to remove unreasonable road surface guide lines from the road surface guide line, thereby improving the calculation accuracy.
Specifically, the center point Cgj(x, y) of each guide-line contour gj is calculated, and it is judged whether the y value of Cgj(x, y) is greater than 0.5h, where h is the image height. If so, gj is retained; otherwise it is filtered out. The reason is that the line-pressing penalty only applies to the intersection approach close to the camera, which generally occupies the lower half of the image, while the far intersection approach generally occupies the upper half, so the threshold is set to 0.5h, one half of the image height.
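A minimal sketch of this guide-line pre-screening, assuming OpenCV contours and using image moments for the contour center, could be:

    import cv2

    def keep_near_side_guide_lines(guide_contours, image_height: int):
        """Keep only guide lines whose contour centre lies in the lower half
        of the image, i.e. on the intersection approach close to the camera."""
        kept = []
        for g in guide_contours:
            m = cv2.moments(g)
            if m["m00"] == 0:               # degenerate contour, skip
                continue
            cy = m["m01"] / m["m00"]        # y coordinate of the contour centre
            if cy > 0.5 * image_height:
                kept.append(g)
        return kept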
The length lenFi of the fitted line lFi of li is calculated by equation (1), and the diagonal length lenRj of the minimum circumscribed rectangle gRj is calculated by equation (2); the two lengths are then compared, and if lenFi is smaller than lenRj of the shortest road surface guide line, the lane boundary line is determined to be a first noise lane line, because according to prior knowledge the length of a lane line is greater than the length of a guide line.

lenFi = sqrt((x_top - x_bot)^2 + (y_top - y_bot)^2)    (1)

lenRj = sqrt(gRj_w^2 + gRj_h^2)    (2)

where (x_top, y_top) and (x_bot, y_bot) are the coordinates of the upper and lower vertices of lFi.
Fig. 6 is a schematic diagram illustrating a size relationship between the lane boundary line and the road surface guide line in the first implementation manner of step S304, and as shown in fig. 6, the lane boundary line 61 is determined as the first noise lane line because the length of the lane boundary line 61 is smaller than the length of each road surface guide line 62.
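As a rough sketch of this first implementation (OpenCV contours are assumed; the minimum circumscribed rectangle is taken from cv2.minAreaRect and the fitted segment from cv2.fitLine), the length check could look like this:

    import numpy as np
    import cv2

    def fitted_segment_length(contour: np.ndarray) -> float:
        """Length of the segment fitted to a lane boundary contour, measured
        between its top-most and bottom-most points, as in equation (1)."""
        vx, vy, x0, y0 = cv2.fitLine(contour, cv2.DIST_L2, 0, 0.01, 0.01).ravel()
        ys = contour[:, 0, 1].astype(float)
        y_top, y_bot = ys.min(), ys.max()
        if abs(vy) < 1e-6:                               # degenerate horizontal fit
            xs = contour[:, 0, 0].astype(float)
            return float(xs.max() - xs.min())
        x_top = x0 + (y_top - y0) * vx / vy
        x_bot = x0 + (y_bot - y0) * vx / vy
        return float(np.hypot(x_bot - x_top, y_bot - y_top))

    def rect_diagonal(contour: np.ndarray) -> float:
        """Diagonal length of the minimum circumscribed rectangle, equation (2)."""
        (_, _), (w, h), _ = cv2.minAreaRect(contour)
        return float(np.hypot(w, h))

    def is_first_noise_by_length(lane_contour, guide_contours) -> bool:
        """A lane boundary shorter than the shortest guide line is treated as noise."""
        if not guide_contours:
            return False
        shortest_guide = min(rect_diagonal(g) for g in guide_contours)
        return fitted_segment_length(lane_contour) < shortest_guide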
The second implementation manner includes two specific implementation steps S3043 and S3044:
step S3043 determines the vertical distance between each lane boundary line and each road surface guide line.
In step S3044, if the vertical distance between the lane boundary line and the road surface guide line is smaller than the first preset distance threshold, the lane boundary line is determined as the first noise lane line.
Optionally, the specific method for determining the first noisy lane line includes:
For each point gj(xk, yk) on the guide-line contour gj, equation (3) is used to compute the intersection point (gl_ikx, gj(yk)) of the horizontal line y = gj(yk) with the fitted straight line lFi, and it is judged whether |gj(xk) - gl_ikx| < thresh1 is satisfied; if so, the counter numj of points satisfying the relationship is incremented by 1. After all points have been counted, if numj is greater than thresh2, the lane boundary line is too close to or overlaps the road surface guide line and does not conform to the positional relationship between a lane boundary line and a road surface guide line, so the lane boundary line is filtered out; otherwise it is retained.
gl_ikx = (gj(yk) - bi) / ki    (3)
Optionally, thresh1 is the first preset distance threshold and thresh2 is a count threshold; optionally, thresh1 = 30 and thresh2 = 5.
Fig. 7 is a schematic diagram illustrating the positional relationship between the lane boundary line and the road surface guide line in the second implementation manner of step S304. As shown in fig. 7, the lane boundary line 71 is determined as a first noise lane line because its vertical distance to the road surface guide line 72 is smaller than the first preset distance threshold.
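A minimal sketch of this second implementation under the same OpenCV-contour assumption is given below; it counts guide-line points that lie within thresh1 of the lane boundary's fitted line (following the interpretation in step S3044), and the default threshold values only mirror the optional values above.

    import numpy as np
    import cv2

    def x_on_fitted_line(lane_contour: np.ndarray, y_values: np.ndarray) -> np.ndarray:
        """x coordinates of the lane boundary's fitted line at the given y values,
        equivalent to equation (3) with the line written as x = (y - b) / k."""
        vx, vy, x0, y0 = cv2.fitLine(lane_contour, cv2.DIST_L2, 0, 0.01, 0.01).ravel()
        if abs(vy) < 1e-6:
            raise ValueError("fitted line is horizontal; check not applicable")
        return x0 + (y_values - y0) * vx / vy

    def is_first_noise_by_distance(lane_contour, guide_contour,
                                   thresh1: float = 30.0, thresh2: int = 5) -> bool:
        """Count guide-line points whose horizontal offset from the lane boundary's
        fitted line is below thresh1; too many such points means the two lines
        nearly coincide, so the lane boundary is treated as first noise."""
        pts = guide_contour.reshape(-1, 2).astype(float)       # (xk, yk) points
        try:
            x_lane = x_on_fitted_line(lane_contour, pts[:, 1])
        except ValueError:
            return False
        num_close = int(np.sum(np.abs(pts[:, 0] - x_lane) < thresh1))
        return num_close > thresh2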
Optionally, the reference comprises a road stop line and the predicted lane line comprises more than one lane boundary line.
In step S305, a second noisy lane line is determined from the relationship between the lane boundary line and the road surface stop line.
Specifically, the road surface stop line marks the top end of the lanes formed by the lane lines; vehicles travel within the road surface lanes and stop in front of the stop line when the intersection light is red. When judging illegal driving of a vehicle, the target road surface image collected by the image acquisition device is mainly used to judge violations that occur before the road surface stop line; the target road surface beyond the stop line is comparatively far away and of low definition, and is instead covered by the image acquisition device on the other side. That is, a lane boundary line beyond the road surface stop line plays no role in judging illegal driving of the vehicle, so a lane boundary line that does not conform to the relationship with the road surface stop line can be determined as a second noise lane line.
Optionally, as shown in fig. 8, step S305 includes two specific implementation steps S3051 and S3052:
and step S3051, determining a target road surface area and a non-target road surface area in the target road surface image according to the road surface stop line.
Specifically, the target road surface area is the road surface area on the side of the road surface stop line close to the image acquisition device, and the non-target road surface area is the road surface area on the side of the road surface stop line far from the image acquisition device.
In step S3052, if the lane boundary is in the non-target road surface region, the lane boundary is determined as a second noise lane line.
Specifically, the position of the lane boundary line is determined; if it lies in the non-target road surface region, that is, it is a lane boundary line of the opposite intersection approach and does not belong to the region monitored by the image acquisition device on this side, the lane boundary line is determined as a second noise lane line.
Fig. 9 is a schematic diagram showing the relationship between the lane boundary line and the road surface stop line in step S305, and as shown in fig. 9, the lane boundary line 91 is located in the non-target road surface region at the distal end of the road surface stop line 92, and therefore the lane boundary line 91 is determined as the second noisy lane line.
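A minimal sketch of this second-noise check, assuming the camera looks toward the stop line from below in image coordinates so that the non-target area lies above the stop line (smaller y), could be:

    import numpy as np
    import cv2

    def is_second_noise(lane_contour: np.ndarray, stop_contour: np.ndarray) -> bool:
        """Lane boundaries lying entirely on the far side of the stop line
        (the non-target road surface area, away from the camera) are noise."""
        # Approximate the stop line by the vertical centre of its bounding box;
        # a fitted line could be used instead for strongly tilted stop lines.
        _, y, _, h = cv2.boundingRect(stop_contour)
        stop_y = y + h / 2.0
        lane_ys = lane_contour[:, 0, 1]
        return bool(np.max(lane_ys) < stop_y)    # every lane point is above the stop line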
Optionally, the reference object includes a vanishing point, and the predicted lane line includes more than two lane boundary lines.
And step S306, determining a third noise lane line according to the relation between the lane boundary line and the vanishing point.
According to the perspective principle, parallel lane boundary lines intersect at a point in the two-dimensional image, the vanishing point. Since a vanishing point is the intersection of the extensions of two different lane boundary lines, a plurality of vanishing points can be determined from two or more predicted lane boundary lines. Because, according to prior knowledge, the lane boundary lines on the road surface are all mutually parallel, their vanishing points obey a certain reference relationship, and a lane boundary line that does not conform to this reference relationship is a third noise lane line.
Optionally, as shown in fig. 10, step S306 includes two specific implementation steps S3061 and S3062:
step S3061, a center vanishing point is determined from the positions of the vanishing points.
If an intersection has n lane boundary lines, there are N = n × (n - 1) / 2 vanishing points. Optionally, the pixel coordinates of the N vanishing points are averaged to obtain a central vanishing point based on them.
Optionally, to further improve the accuracy of the central vanishing point, vanishing points with a large position deviation may be removed first, and the remaining vanishing points with small deviation are then averaged to determine the central vanishing point, which improves the robustness of this step.
In step S3062, if the vertical distance between the lane boundary line and the center vanishing point is greater than the second preset distance threshold, the lane boundary line is determined as a third noise lane line.
According to prior knowledge, because all lane boundary lines are parallel on the real target road surface, the vanishing points of the lane boundary lines in the target road surface image should cluster in a small region. The vertical distance between a lane boundary line and the central vanishing point is therefore calculated, and from it whether the boundary line is parallel to the other lane lines on the real target road surface can be judged. If the vertical distance between the lane boundary line and the central vanishing point is greater than the second preset distance threshold, the boundary line is not parallel to the other lane boundaries and can be determined as a third noise lane line.
Fig. 11 is a schematic diagram illustrating the relationship between the lane boundary line and the vanishing point in step S306, and as shown in fig. 11, the vertical distance 113 between the lane boundary line 111 and the central vanishing point 112 is greater than the second preset distance threshold, so that the lane boundary line 111 is determined as the third noise lane line.
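As a rough sketch of steps S3061 and S3062, the fragment below fits a line to each lane boundary contour, intersects every pair of fitted lines to obtain the vanishing points, averages them into a centre vanishing point, and flags boundaries whose fitted line is farther from that centre than a threshold (the perpendicular point-to-line distance is used for the vertical distance, and the default threshold is only an example):

    import itertools
    import numpy as np
    import cv2

    def contour_to_line(contour: np.ndarray):
        """Slope k and intercept b of the line fitted to a lane boundary contour
        (assumes the fitted line is not exactly vertical in image coordinates)."""
        vx, vy, x0, y0 = cv2.fitLine(contour, cv2.DIST_L2, 0, 0.01, 0.01).ravel()
        k = vy / (vx if abs(vx) > 1e-6 else 1e-6)
        return float(k), float(y0 - k * x0)

    def third_noise_indices(lane_contours, dist_thresh: float = 100.0):
        """Indices of lane boundaries whose fitted line lies far from the
        centre vanishing point."""
        lines = [contour_to_line(c) for c in lane_contours]
        vanishing_points = []
        for (k1, b1), (k2, b2) in itertools.combinations(lines, 2):
            if abs(k1 - k2) < 1e-6:          # parallel in the image: no finite vanishing point
                continue
            x = (b2 - b1) / (k1 - k2)
            vanishing_points.append((x, k1 * x + b1))
        if not vanishing_points:
            return []
        cx, cy = np.mean(np.asarray(vanishing_points), axis=0)   # centre vanishing point
        noisy = []
        for i, (k, b) in enumerate(lines):
            distance = abs(k * cx - cy + b) / np.hypot(k, 1.0)   # point-to-line distance
            if distance > dist_thresh:
                noisy.append(i)
        return noisy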
In step S307, a noise lane line in the predicted lane lines is removed to obtain an actual lane line.
The noise lane line comprises at least one of a first noise lane line, a second noise lane line and a third noise lane line.
In the step of this embodiment, through steps S304 to S306, the first noise lane line, the second noise lane line, and the third noise lane line in the predicted lane lines are determined sequentially by the reference object and are filtered out from the predicted lane lines, so that the influence of the wrong predicted lane line on the illegal driving judgment is avoided, and the accuracy and the timeliness of the lane line identification method provided by this embodiment are improved.
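Putting the three checks together, a small composition sketch (taking the individual noise predicates as arguments, for example partial applications of the functions sketched above) shows how the noise lane lines are removed from the predicted lane lines:

    from typing import Callable, Iterable, List, Sequence

    def remove_noise_lane_lines(predicted: Sequence,
                                noise_checks: Iterable[Callable[[object], bool]]) -> List:
        """Keep only the lane boundaries that pass every noise check, i.e. the
        actual lane lines obtained after filtering out the first, second and
        third noise lane lines."""
        checks = list(noise_checks)
        actual = []
        for lane in predicted:
            if any(check(lane) for check in checks):
                continue                      # filtered out as a noise lane line
            actual.append(lane)
        return actual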
In step S308, the actual lane line is determined as the target road lane line.
In step S309, the target road lane line is saved as a structural file corresponding to the target road.
Optionally, after the target road surface lane lines are obtained, they are saved as a structural file corresponding to the target road surface and sent to a server for storage. When illegal driving is judged according to the target road surface lane lines, the structural file can be called directly, which improves the accuracy and timeliness of judging illegal driving of vehicles; the structural file can also be transplanted to a similar road surface area, which improves flexibility of use.
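The description does not specify the format of the structural file; assuming a simple JSON layout, saving the target road surface lane lines could look like this:

    import json
    import numpy as np

    def save_lane_lines(path: str, lane_contours) -> None:
        """Persist the recognised target road surface lane lines as a structural
        file that can later be reused for illegal-driving judgment."""
        payload = [{"id": i, "points": np.asarray(c).reshape(-1, 2).tolist()}
                   for i, c in enumerate(lane_contours)]
        with open(path, "w", encoding="utf-8") as f:
            json.dump({"lane_lines": payload}, f)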
In this embodiment, the implementation manners of step S301 to step S303 and step S307 to step S308 are the same as the implementation manners of step S201 to step S203 and step S205 to step S206 in the embodiment shown in fig. 3 of the present invention, and are not described again.
Fig. 12 is a schematic structural view of a road lane line recognition device according to an embodiment of the present invention, and as shown in fig. 12, the road lane line recognition device 4 according to the embodiment includes:
and an image obtaining module 41, configured to obtain a target road surface image.
And the image extraction module 42 is used for performing image segmentation extraction on the target road surface image to obtain a predicted lane line and a reference object.
And a lane line correction module 43, configured to correct the predicted lane line by using the reference object to obtain an actual lane line.
And the lane line determining module 44 is used for determining the actual lane line as the target road lane line.
Optionally, the image extraction module 42 is specifically configured to:
and acquiring a road surface feature recognition model trained to be convergent.
And carrying out segmentation and extraction on the target road surface image through a road surface feature recognition model so as to obtain a predicted lane line and a reference object.
Optionally, the reference comprises at least one road guide line and a road stop line; an image extraction module 42, further configured to:
and obtaining a training sample of the road surface characteristic identification model, wherein the training sample is a training road surface image, and the training road surface image comprises marked lane lines, road surface guide lines and road surface stop lines.
And training the initial road surface feature recognition model by using the training sample until a preset model convergence condition is met, so as to obtain a road surface feature recognition model trained to be converged.
Optionally, the lane line correction module 43 is specifically configured to:
and determining the noise lane line according to the relation between the predicted lane line and the reference object.
And removing the noise lane lines in the predicted lane lines to obtain actual lane lines.
Optionally, the reference object includes at least one road surface guide line, the predicted lane line includes at least one lane boundary line, and the lane line modification module 43 is specifically configured to, when determining the noise lane line according to a relationship between the predicted lane line and the reference object:
and determining a first noise lane line according to the relation between the lane boundary line and the road surface guide line.
Optionally, the lane line correction module 43, when determining the first noise lane line according to a relationship between the lane boundary line and the road surface guide line, is specifically configured to:
the length of the lane boundary line and the length of the road surface guide line are acquired.
And if the length of the lane boundary line is less than that of the shortest road guide line in the road guide lines, determining the lane boundary line as a first noise lane line.
Optionally, the lane line correction module 43, when determining the first noise lane line according to a relationship between the lane boundary line and the road surface guide line, is specifically configured to:
and determining the vertical distance between each lane boundary line and each road surface guide line.
And if the vertical distance between the lane boundary line and a certain road surface guide line is smaller than a first preset distance threshold value, determining the lane boundary line as a first noise lane line.
Optionally, the reference object includes a road stop line, the predicted lane line includes more than one lane boundary line, and the lane line modification module 43 is specifically configured to, when determining the noise lane line according to a relationship between the predicted lane line and the reference object:
and determining a second noise lane line according to the relation between the lane boundary line and the road surface stop line.
Optionally, when determining the second noise lane line according to the relationship between the lane boundary line and the road stop line, the lane line correction module 43 is specifically configured to:
and determining a target road surface area and a non-target road surface area in the target road surface image according to the road surface stop line.
And if the lane boundary line is in the non-target road surface area, determining the lane boundary line as a second noise lane line.
Optionally, the reference object includes a vanishing point, the predicted lane line includes more than two lane boundary lines, and the lane line modification module 43 is specifically configured to, when determining the noise lane line according to a relationship between the predicted lane line and the reference object:
and determining a third noise lane line according to the relationship between the lane boundary line and the vanishing point.
Optionally, when determining the third noise lane line according to the relationship between the lane boundary line and the vanishing point, the lane line correction module 43 is specifically configured to:
and determining the central vanishing point according to the positions of the vanishing points.
And if the vertical distance between the lane boundary line and the center vanishing point is greater than a second preset distance threshold, determining the lane boundary line as a third noise lane line.
Optionally, the lane line determination module 44 is further configured to:
and saving the target road surface lane line as a structural file corresponding to the target road surface.
The image acquisition module 41, the image extraction module 42, the lane line correction module 43, and the lane line determination module 44 are connected in sequence. The lane line recognition device 4 provided in this embodiment may perform the lane line recognition method provided in any one of the embodiments corresponding to fig. 2 to fig. 11, and the implementation principle and technical effect are similar, and are not described herein again.
Fig. 13 is a schematic view of an electronic device according to an embodiment of the present invention, and as shown in fig. 13, the electronic device according to the embodiment includes: a memory 51, a processor 52 and a computer program.
The computer program is stored in the memory 51 and configured to be executed by the processor 52 to implement the method for identifying a lane line on a road surface according to any one of the embodiments corresponding to fig. 2 to 11 of the present invention.
The memory 51 and the processor 52 are connected by a bus 53.
For the relevant descriptions and effects, reference may be made to the corresponding steps in the embodiments of fig. 2 to fig. 11, and details are not repeated here.
One embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, where the computer program is executed by a processor to implement the method for identifying a lane line on a road surface according to any one of the embodiments corresponding to fig. 2 to 11 of the present invention.
The computer readable storage medium may be, among others, ROM, Random Access Memory (RAM), CD-ROM, magnetic tape, floppy disk, optical data storage device, and the like.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of modules is merely a division of logical functions, and an actual implementation may have another division, for example, a plurality of modules or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or modules, and may be in an electrical, mechanical or other form.
Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This invention is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the invention and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the following claims.
It will be understood that the invention is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the invention is limited only by the appended claims.

Claims (15)

1. A road surface lane line identification method, the method comprising:
acquiring a target road surface image;
carrying out image segmentation and extraction on the target road surface image to obtain a predicted lane line and a reference object;
correcting the predicted lane line by using the reference object to obtain an actual lane line;
and determining the actual lane line as a target road surface lane line.
2. The method according to claim 1, wherein the carrying out image segmentation and extraction on the target road surface image to obtain the predicted lane line and the reference object comprises:
acquiring a road surface feature recognition model trained to convergence;
and carrying out segmentation and extraction on the target road surface image through the road surface feature recognition model to obtain the predicted lane line and the reference object.
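As a rough illustration of this claim, the sketch below runs a hypothetical converged segmentation model on the target road surface image and splits its per-pixel output into predicted lane line instances and reference objects. The class indices, the model's output format, and the use of OpenCV connected components are all assumptions.

```python
# Illustrative segmentation-and-extraction step (assumed model interface and class indices).
import cv2
import numpy as np

LANE, GUIDE, STOP = 1, 2, 3  # assumed class indices in the segmentation mask

def segment_and_extract(model, image):
    # Assumption: the model returns an (H, W) array of per-pixel class indices.
    mask = model(image)

    def components(class_id):
        binary = (mask == class_id).astype(np.uint8)
        n, labels = cv2.connectedComponents(binary)
        # Each connected component is returned as an array of (row, col) pixel coordinates.
        return [np.argwhere(labels == i) for i in range(1, n)]

    predicted_lane_lines = components(LANE)
    references = {"guide_lines": components(GUIDE), "stop_lines": components(STOP)}
    return predicted_lane_lines, references
```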
3. The method according to claim 2, wherein the reference object comprises at least one of a road surface guide line and a road surface stop line; before the acquiring a road surface feature recognition model trained to convergence, the method further comprises:
acquiring a training sample of the road surface feature recognition model, wherein the training sample is a training road surface image, and the training road surface image comprises marked lane lines, road surface guide lines and road surface stop lines;
and training an initial road surface feature recognition model by using the training sample until a preset model convergence condition is met, to obtain the road surface feature recognition model trained to convergence.
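A hedged sketch of the training step recited in claim 3, written against PyTorch with an assumed per-pixel labelling of lane lines, road surface guide lines and road surface stop lines; the dataset wrapper, class count, and the loss-plateau convergence rule are illustrative choices, not requirements of the claim.

```python
# Illustrative training-to-convergence loop for a road surface feature recognition model.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, Dataset

NUM_CLASSES = 4  # assumed: background, lane line, road surface guide line, road surface stop line

class RoadSurfaceDataset(Dataset):
    """Hypothetical dataset of training road surface images with per-pixel class labels."""
    def __init__(self, images, masks):
        self.images, self.masks = images, masks  # tensors: (3, H, W) images, (H, W) long masks
    def __len__(self):
        return len(self.images)
    def __getitem__(self, idx):
        return self.images[idx], self.masks[idx]

def train_to_convergence(model, dataset, epochs=50, tol=1e-3, lr=1e-3):
    loader = DataLoader(dataset, batch_size=8, shuffle=True)
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    criterion = nn.CrossEntropyLoss()
    prev_loss = float("inf")
    for _ in range(epochs):
        running = 0.0
        for images, masks in loader:
            optimizer.zero_grad()
            logits = model(images)           # assumed shape: (N, NUM_CLASSES, H, W)
            loss = criterion(logits, masks)  # masks: (N, H, W) with class indices
            loss.backward()
            optimizer.step()
            running += loss.item()
        epoch_loss = running / len(loader)
        # Preset convergence condition (assumed): epoch loss change below a tolerance.
        if abs(prev_loss - epoch_loss) < tol:
            break
        prev_loss = epoch_loss
    return model
```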
4. The method according to claim 1, wherein the correcting the predicted lane line by using the reference object to obtain the actual lane line comprises:
determining a noise lane line according to the relationship between the predicted lane line and the reference object;
and removing the noise lane line from the predicted lane line to obtain the actual lane line.
5. The method according to claim 4, wherein the reference object comprises at least one road surface guide line, the predicted lane line comprises at least one lane boundary line, and the determining a noise lane line according to the relationship between the predicted lane line and the reference object comprises:
determining a first noise lane line according to the relationship between the lane boundary line and the road surface guide line.
6. The method according to claim 5, wherein the determining a first noise lane line according to the relationship between the lane boundary line and the road surface guide line comprises:
acquiring the length of the lane boundary line and the length of each road surface guide line;
and if the length of the lane boundary line is smaller than the length of the shortest road surface guide line among the road surface guide lines, determining the lane boundary line as a first noise lane line.
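An illustrative reading of claim 6 in code: treat lines as point sequences, measure their arc length, and mark any lane boundary line shorter than the shortest road surface guide line as a first noise lane line. The polyline representation is an assumption.

```python
# Illustrative length-based first-noise-lane-line filter (claim 6, assumed polyline format).
import math

def polyline_length(points):
    return sum(math.dist(p, q) for p, q in zip(points, points[1:]))

def first_noise_by_length(boundary_lines, guide_lines):
    if not guide_lines:
        return []
    shortest_guide = min(polyline_length(g) for g in guide_lines)
    return [b for b in boundary_lines if polyline_length(b) < shortest_guide]
```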
7. The method according to claim 5, wherein the determining a first noise lane line according to the relationship between the lane boundary line and the road surface guide line comprises:
determining the perpendicular distance between each lane boundary line and each road surface guide line;
and if the perpendicular distance between a lane boundary line and a road surface guide line is smaller than a first preset distance threshold, determining the lane boundary line as a first noise lane line.
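A sketch of the alternative test in claim 7: approximate each line by a segment and discard a lane boundary line that lies within the first preset distance threshold of some road surface guide line (it is then likely a guide line that was misclassified). The segment approximation and helper names are assumptions.

```python
# Illustrative distance-to-guide-line filter (claim 7, assumed segment representation).
import math

def seg_point_gap(seg, p):
    """Perpendicular distance from point p to the infinite line through segment seg."""
    (x1, y1), (x2, y2) = seg
    px, py = p
    return abs((y2 - y1) * px - (x2 - x1) * py + x2 * y1 - y2 * x1) / math.hypot(x2 - x1, y2 - y1)

def line_gap(a, b):
    """Approximate perpendicular distance between two near-parallel segments."""
    return min(seg_point_gap(b, a[0]), seg_point_gap(b, a[1]))

def first_noise_by_guide_distance(boundary_lines, guide_lines, first_distance_threshold):
    return [b for b in boundary_lines
            if any(line_gap(b, g) < first_distance_threshold for g in guide_lines)]
```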
8. The method according to claim 4, wherein the reference object comprises a road surface stop line, the predicted lane line comprises more than one lane boundary line, and the determining a noise lane line according to the relationship between the predicted lane line and the reference object comprises:
determining a second noise lane line according to the relationship between the lane boundary line and the road surface stop line.
9. The method according to claim 8, wherein the determining a second noise lane line according to the relationship between the lane boundary line and the road surface stop line comprises:
determining a target road surface area and a non-target road surface area in the target road surface image according to the road surface stop line;
and if the lane boundary line is in the non-target road surface area, determining the lane boundary line as a second noise lane line.
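A minimal sketch of claim 9 under the simplifying assumption that the road surface stop line is roughly horizontal in the image, so its row splits the image into a target road surface area below it and a non-target area above it; a lane boundary line whose points lie mostly in the non-target area is a second noise lane line.

```python
# Illustrative stop-line-based second-noise-lane-line filter (claim 9, assumed horizontal stop line).
def second_noise_by_stop_line(boundary_lines, stop_line_y):
    noise = []
    for line in boundary_lines:  # each line: list of (x, y) image points
        mean_y = sum(y for _, y in line) / len(line)
        if mean_y < stop_line_y:  # above the stop line: non-target road surface area
            noise.append(line)
    return noise
```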
10. The method according to claim 4, wherein the reference object comprises a vanishing point, the predicted lane line comprises more than two lane boundary lines, and the determining a noise lane line according to the relationship between the predicted lane line and the reference object comprises:
determining a third noise lane line according to the relationship between the lane boundary line and the vanishing point.
11. The method according to claim 10, wherein the determining a third noise lane line according to the relationship between the lane boundary line and the vanishing point comprises:
determining a central vanishing point according to the position of each vanishing point;
and if the perpendicular distance between the lane boundary line and the central vanishing point is greater than a second preset distance threshold, determining the lane boundary line as a third noise lane line.
12. The method according to any one of claims 1 to 11, further comprising, after the determining the actual lane line as a target road surface lane line:
saving the target road surface lane line as a structured file corresponding to the target road surface.
13. A road surface lane line identification apparatus, characterized in that the apparatus comprises:
an image acquisition module, configured to acquire a target road surface image;
an image extraction module, configured to carry out image segmentation and extraction on the target road surface image to obtain a predicted lane line and a reference object;
a lane line correction module, configured to correct the predicted lane line by using the reference object to obtain an actual lane line;
and a lane line determination module, configured to determine the actual lane line as a target road surface lane line.
14. An electronic device, comprising: a memory, a processor, and a computer program;
wherein the computer program is stored in the memory and configured to be executed by the processor to implement the road surface lane line identification method according to any one of claims 1 to 12.
15. A computer-readable storage medium, having computer-executable instructions stored thereon, wherein the computer-executable instructions, when executed by a processor, implement the road surface lane line identification method according to any one of claims 1 to 12.
CN202010393718.7A 2020-05-11 2020-05-11 Method and device for identifying road lane lines, electronic equipment and storage medium Pending CN111563463A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010393718.7A 2020-05-11 2020-05-11 Method and device for identifying road lane lines, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN111563463A (en) 2020-08-21

Family

ID=72073369

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109583280A (en) * 2017-09-29 2019-04-05 BYD Co., Ltd. Lane detection method, apparatus, equipment and storage medium
CN108734105A (en) * 2018-04-20 2018-11-02 Neusoft Corporation Method for detecting lane lines, device, storage medium and electronic equipment
CN109284674A (en) * 2018-08-09 2019-01-29 Zhejiang Dahua Technology Co., Ltd. Method and device for determining lane line
CN109637151A (en) * 2018-12-31 2019-04-16 Shanghai Eye Control Technology Co., Ltd. Recognition method for illegal driving in highway emergency lanes
CN109948418A (en) * 2018-12-31 2019-06-28 Shanghai Eye Control Technology Co., Ltd. Deep learning-based automatic auditing method for violations of lane guidance
CN109949578A (en) * 2018-12-31 2019-06-28 Shanghai Eye Control Technology Co., Ltd. Deep learning-based automatic auditing method for vehicle line-pressing violations
CN111046829A (en) * 2019-12-23 2020-04-21 Shanghai Jiao Tong University Online lane-level positioning method and system based on prior reasoning

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112132109A (en) * 2020-10-10 2020-12-25 Beijing Baidu Netcom Science and Technology Co., Ltd. Lane line processing and lane positioning method, device, equipment and storage medium
CN112507852A (en) * 2020-12-02 2021-03-16 Shanghai Eye Control Technology Co., Ltd. Lane line identification method, device, equipment and storage medium
CN112990087A (en) * 2021-04-08 2021-06-18 Jinan Boguan Intelligent Technology Co., Ltd. Lane line detection method, device, equipment and readable storage medium
CN112990087B (en) * 2021-04-08 2022-08-19 Jinan Boguan Intelligent Technology Co., Ltd. Lane line detection method, device, equipment and readable storage medium
CN113392793A (en) * 2021-06-28 2021-09-14 Beijing Baidu Netcom Science and Technology Co., Ltd. Method, device, equipment, storage medium and unmanned vehicle for identifying lane line
CN113591730A (en) * 2021-08-03 2021-11-02 Hubei Ecarx Technology Co., Ltd. Method, device and equipment for recognizing lane grouping line
CN113591730B (en) * 2021-08-03 2023-11-10 Hubei Ecarx Technology Co., Ltd. Method, device and equipment for identifying lane grouping lines
CN113689718A (en) * 2021-08-13 2021-11-23 Jilin University Intelligent signal lamp and lane matching system and method
CN113689718B (en) * 2021-08-13 2022-09-13 Jilin University Intelligent signal lamp and lane matching system and method
CN115049997A (en) * 2022-06-07 2022-09-13 Beijing Baidu Netcom Science and Technology Co., Ltd. Method and device for generating edge lane line, electronic device and storage medium

Similar Documents

Publication Publication Date Title
CN111563463A (en) Method and device for identifying road lane lines, electronic equipment and storage medium
JP6347815B2 (en) Method, apparatus and device for detecting lane boundaries
CN106652465B (en) Method and system for identifying abnormal driving behaviors on road
US8750567B2 (en) Road structure detection and tracking
JP4822766B2 (en) Road marking recognition device and system
US8305445B2 (en) Lane marker recognizing apparatus
Lee et al. A lane-departure identification based on LBPE, Hough transform, and linear regression
CN105718870A (en) Road marking line extraction method based on a forward-facing camera in automatic driving
CN110188606B (en) Lane recognition method and device based on hyperspectral imaging and electronic equipment
CN107180230B (en) Universal license plate recognition method
CN109948552B (en) Method for detecting lane line in complex traffic environment
Youjin et al. A robust lane detection method based on vanishing point estimation
CN111126323A (en) Bayonet element recognition and analysis method and system serving for traffic violation detection
CN111488808B (en) Lane line detection method based on traffic violation image data
CN112990087B (en) Lane line detection method, device, equipment and readable storage medium
CN109635737A (en) Automobile navigation and positioning assistance method based on visual recognition of pavement marking lines
CN101369312B (en) Method and equipment for detecting intersection in image
CN107918775B (en) Zebra crossing detection method and system for assisting safe driving of vehicle
JP3589293B2 (en) Road white line detection method
Xuan et al. Robust lane-mark extraction for autonomous driving under complex real conditions
Wennan et al. Lane detection in some complex conditions
CN116682268A (en) Portable urban road vehicle violation inspection system and method based on machine vision
CN115984772A (en) Road ponding detection method and terminal based on video monitoring
Merugu et al. Multi lane detection, curve fitting and lane type classification
KR100976142B1 (en) Detection method of road vehicles

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination