CN111814746A - Method, device, equipment and storage medium for identifying lane line

Info

Publication number
CN111814746A
CN111814746A (application CN202010789662.7A)
Authority
CN
China
Prior art keywords
lane line
lane
attribute parameters
frame
video
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010789662.7A
Other languages
Chinese (zh)
Inventor
陈佳腾
庄伯金
王少军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ping An Technology Shenzhen Co Ltd
Original Assignee
Ping An Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ping An Technology Shenzhen Co Ltd filed Critical Ping An Technology Shenzhen Co Ltd
Priority to CN202010789662.7A
Priority to PCT/CN2020/123251
Publication of CN111814746A
Priority to CN202011204772.9A

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 - Scenes; Scene-specific elements
    • G06V 20/50 - Context or environment of the image
    • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V 20/588 - Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 - Arrangements for image or video recognition or understanding
    • G06V 10/20 - Image preprocessing
    • G06V 10/26 - Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G06V 10/267 - Segmentation of patterns by performing operations on regions, e.g. growing, shrinking or watersheds
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 - Scenes; Scene-specific elements
    • G06V 20/40 - Scenes; Scene-specific elements in video content
    • G06V 20/41 - Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)
  • Traffic Control Systems (AREA)

Abstract

The embodiment of the invention provides a method, a device, equipment and a storage medium for identifying lane lines. The method comprises: acquiring a video, and determining attribute parameters of a lane line according to video frame information in the video, wherein the attribute parameters comprise the type of the lane line, the stability of the lane line and the position of the lane line; determining attribute parameters of historical frame lane lines and attribute parameters of future frame lane lines according to the attribute parameters of the lane line; and determining the attribute parameters of the current frame lane line according to the attribute parameters of the historical frame lane line and the attribute parameters of the future frame lane line. The invention relates to the technical field of image processing. By jointly inferring the attributes of the current frame's lane line from the temporal attributes of the preceding and following video frames together with the output of the segmentation model, the judgment of the lane line type is made more accurate, the output of the lane line is more stable, and abrupt changes of category are reduced.

Description

Method, device, equipment and storage medium for identifying lane line
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to a method, an apparatus, a device, and a storage medium for identifying lane lines.
Background
The detection and identification of lane lines are indispensable in fields such as automatic driving and intelligent traffic violation capture. Accurate detection of lane lines can assist a vehicle in driving automatically within a legal lane, and intelligent violation capture, for example of solid-line lane changes, places very high demands on accurate identification of the lane line category. The currently common lane line identification method uses a semantic segmentation model to obtain a mask of the lane line and determines the type of the lane line from the category of the pixels in the mask region.
In the prior art, the number of pixels to be processed is reduced by extracting candidate feature points that represent the center line of the lane line. By analyzing lane line feature points and noise, one method filters out pseudo feature points irrelevant to the parameter space, avoiding parameter-space conversion and complex mathematical operations and improving the accuracy of lane line detection. A lane line prediction model has also been proposed for predicting lane lines under special road conditions; the model maintains an internal memory and predicts such lane lines from the past-to-present trend of the image sequence.
At present, the reliability of lane line detection is low, and the errors it introduces are irreparable in downstream applications. In addition, lane lines can be occluded by obstacles in real scenes, so that lane line information cannot be obtained accurately and is unstable during identification.
Disclosure of Invention
In view of the above problems, embodiments of the present invention are proposed to provide a method, apparatus, device and storage medium for identifying a lane line that overcome, or at least partially solve, the above problems.
a method of identifying a lane line, comprising:
acquiring a video, and determining attribute parameters of a lane line according to video frame information in the video, wherein the attribute parameters of the lane line comprise the type of the lane line, the stability of the lane line and the position of the lane line;
determining attribute parameters of the historical frame lane lines and attribute parameters of the future frame lane lines according to the attribute parameters of the lane lines;
and determining the attribute parameters of the current frame lane line according to the attribute parameters of the historical frame lane line and the attribute parameters of the future frame lane line.
Preferably, the step of determining the attribute parameters of the lane line according to the video frame information in the video includes:
identifying the category of the lane line, and identifying the lane line as a solid line or a dashed line according to video frame information in the video;
identifying the stability of the lane line, and determining whether the stability of the lane line reaches a preset value according to video frame information in the video;
and identifying the position of the lane line, and detecting the position information of the lane line according to the video frame information in the video.
Preferably, the step of identifying the stability of the lane line and determining whether the stability of the lane line reaches a preset value according to video frame information in the video includes:
if the type of the lane line is a solid line, setting the stability to stable when the far end of the lane line disappears at the horizon, and/or the lane line persists while the vehicle is moving, and/or the length of the lane line reaches a preset value, and otherwise setting the stability of the lane line to unstable;
if the type of the lane line is a dashed line, setting the stability to stable when no obstacle is present in front of the lane line and/or the lane line disappears as the vehicle keeps moving, and otherwise setting the stability of the lane line to unstable.
Preferably, the step of determining the attribute parameter of the current frame lane line according to the attribute parameter of the historical frame lane line and the attribute parameter of the future frame lane line includes:
determining the attribute parameters of a first lane line by using the attribute parameters of the historical frame lane line;
and correcting the attribute parameters of the first lane line according to the attributes of the future frame lane line to determine the attribute parameters of the current frame lane line.
Preferably, the step of determining the attribute parameter of the first lane line by using the attribute parameter of the lane line of the history frame includes:
determining the stability of the first lane line and the position of the first lane line according to video frame information in the video;
determining the category of the first lane line according to the category of the historical frame lane line to obtain the attribute parameters of the first lane line.
Preferably, the step of determining the attribute parameter of the current frame lane line by correcting the attribute parameter of the first lane line according to the attribute of the future frame lane line includes:
and if the stability of the future frame lane line is stable and the stability of the first lane line is unstable, correcting the type of the first lane line and the stability of the first lane line according to the type of the future frame lane line and the stability of the future frame lane line to obtain the attribute parameters of the current frame lane line.
The attribute parameters of the current frame lane line comprise: the type of the current frame lane line, the stability of the current frame lane line and the position of the current frame lane line.
In addition, to achieve the above object, the present application also provides a lane line recognition apparatus comprising:
an attribute module: configured to acquire a video and determine attribute parameters of a lane line according to video frame information in the video, wherein the attribute parameters of the lane line comprise the type of the lane line, the stability of the lane line and the position of the lane line;
a segmentation module: the attribute parameters of the historical frame lane lines and the attribute parameters of the future frame lane lines are determined according to the attribute parameters of the lane lines;
a determination module: and the attribute parameters of the current frame lane line are determined according to the attribute parameters of the historical frame lane line and the attribute parameters of the future frame lane line.
The application also provides an electronic device for identifying lane lines, comprising a processor, a memory and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the above method of identifying a lane line.
The present application further provides a computer-readable storage medium for identifying lane lines, on which a computer program is stored, the computer program, when executed by a processor, implementing the steps of the above method of identifying a lane line.
embodiments of the present invention include an electronic device, comprising a processor, a memory, and a computer program stored on the memory and capable of running on the processor, wherein the computer program, when executed by the processor, implements the steps of the above-described method for identifying lane lines.
Embodiments of the present invention include a computer-readable storage medium having stored thereon a computer program, which, when executed by a processor, implements the steps of the above-described method of identifying lane lines.
The method comprises: acquiring a video, and determining attribute parameters of a lane line according to video frame information in the video, wherein the attribute parameters comprise the type of the lane line, the stability of the lane line and the position of the lane line; determining attribute parameters of historical frame lane lines and attribute parameters of future frame lane lines according to the attribute parameters of the lane line; and determining the attribute parameters of the current frame lane line according to the attribute parameters of the historical frame lane line and the attribute parameters of the future frame lane line. The method has the following advantages: by jointly inferring the attributes of the current frame's lane line from the temporal attributes of the preceding and following video frames together with the output of the segmentation model, the lane line category is judged more accurately. Combining the historical attributes with the future attributes guarantees the accuracy of the current attributes. With the temporal attributes of the preceding and following frames, the output of the lane line is more stable and abrupt changes of category are reduced.
Drawings
FIG. 1 is a flow chart of the steps of one embodiment of a method of identifying a lane line of the present invention;
FIG. 2 is a block diagram of an embodiment of a lane-line-identifying device according to the present invention;
FIG. 3 schematically illustrates a flow chart of steps of an embodiment of a method of identifying lane lines of the present application;
FIG. 4 is a flow chart that schematically illustrates the steps of one embodiment of a method of identifying lane lines, in accordance with the present application;
fig. 5 is a schematic diagram of an electronic device implementing the method of identifying a lane line of the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
Referring to fig. 1, a flowchart illustrating steps of an embodiment of a method for identifying a lane line according to the present invention is shown, which specifically includes the following steps:
step S100, acquiring a video, and determining attribute parameters of a lane line according to video frame information in the video, wherein the attribute parameters of the lane line comprise the type of the lane line, the stability of the lane line and the position of the lane line;
determining attribute parameters of the lane lines according to video frame information in the video, wherein the step comprises the following steps:
identifying the category of the lane line, and identifying the lane line as a solid line or a dashed line according to video frame information in the video;
identifying the stability of the lane line, and determining whether the stability of the lane line reaches a preset value according to video frame information in the video; the step of identifying the stability of the lane line and determining whether the stability of the lane line reaches a preset value according to the video frame information in the video comprises the following steps:
if the type of the lane line is a solid line, setting the stability to stable when the far end of the lane line disappears at the horizon, and/or the lane line persists while the vehicle is moving, and/or the length of the lane line reaches a preset value, and otherwise setting the stability of the lane line to unstable;
if the type of the lane line is a dashed line, setting the stability to stable when no obstacle is present in front of the lane line and/or the lane line disappears as the vehicle keeps moving, and otherwise setting the stability of the lane line to unstable.
And identifying the position of the lane line, and detecting the position information of the lane line according to the video frame information in the video.
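The three attribute parameters determined in step S100 can be held in a small record type. A minimal Python sketch follows; the class name, field names and coordinate layout are illustrative assumptions, since the patent does not specify a data structure:

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Tuple

class LineType(Enum):
    SOLID = "solid"
    DASHED = "dashed"

@dataclass
class LaneLineAttributes:
    line_type: LineType                  # category of the lane line: solid or dashed
    stable: bool                         # stability flag of the lane line
    position: List[Tuple[float, float]] = field(default_factory=list)  # lane line points in image coordinates

# one lane line detected in a frame
attrs = LaneLineAttributes(LineType.SOLID, True, [(0.0, 480.0), (320.0, 240.0)])
```

A record like this would be produced per lane line per video frame and passed between the subsequent steps.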
Step S200, determining attribute parameters of the lane lines of the historical frames and attribute parameters of the lane lines of the future frames according to the attribute parameters of the lane lines;
the step of determining the attribute parameters of the current frame lane line according to the attribute parameters of the historical frame lane line and the attribute parameters of the future frame lane line comprises the following steps:
determining the attribute parameters of a first lane line by using the attribute parameters of the historical frame lane line;
and correcting the attribute parameters of the first lane line according to the attributes of the future frame lane line to determine the attribute parameters of the current frame lane line.
The step of determining the attribute parameter of the first lane line by using the attribute parameter of the history frame lane line includes:
determining the stability of the first lane line and the position of the first lane line according to video frame information in the video;
determining the category of the first lane line according to the category of the historical frame lane line to obtain the attribute parameters of the first lane line.
Step S300, determining the attribute parameters of the current frame lane line according to the attribute parameters of the historical frame lane line and the attribute parameters of the future frame lane line.
The step of determining the attribute parameter of the current frame lane line by correcting the attribute parameter of the first lane line according to the attribute of the future frame lane line includes:
and if the stability of the future frame lane line is stable and the stability of the first lane line is unstable, correcting the type of the first lane line and the stability of the first lane line according to the type of the future frame lane line and the stability of the future frame lane line to obtain the attribute parameters of the current frame lane line.
One of the core ideas of the embodiment of the invention is that, by jointly inferring the lane line category from the temporal information of the preceding and following video frames together with the output of the segmentation model, the judgment of the lane line category is made more accurate and more robust. This matters especially for violation capture of solid-line lane changes: if a dashed line is mistakenly recognized as a solid line, a wrongful penalty results. When a dashed line is occluded by an obstacle, the segmentation model can easily identify it as a solid line, so combining the attributes of the historical frame lane line and the future frame lane line is very important. In addition, with the preceding and following temporal information, the output of the lane line is more stable and abrupt changes of category are reduced.
The attribute parameters of the current frame lane line comprise: the type of the current frame lane line, the stability of the current frame lane line and the position of the current frame lane line.
It should be noted that, for simplicity of description, the method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the illustrated order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments of the present invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no particular act is required to implement the invention.
Referring to fig. 2, a block diagram of an embodiment of a lane line recognition device according to the present invention is shown, and may specifically include the following modules:
an apparatus for identifying a lane line, comprising:
the attribute module 100: the method and the device are used for acquiring a video and determining attribute parameters of a lane line according to video frame information in the video, wherein the attribute parameters of the lane line comprise the type of the lane line, the stability of the lane line and the position of the lane line.
The segmentation module 200: and the attribute parameters of the historical frame lane lines and the attribute parameters of the future frame lane lines are determined according to the attribute parameters of the lane lines.
The determination module 300: and the attribute parameters of the current frame lane line are determined according to the attribute parameters of the historical frame lane line and the attribute parameters of the future frame lane line.
The segmentation module 200 is configured to determine an attribute parameter of a lane line in a historical frame and an attribute parameter of a lane line in a future frame according to the attribute parameter of the lane line, and further includes:
a category module: the method is used for identifying the type of the lane line, and identifying the lane line as a solid line or a dotted line according to video frame information in the video.
A stabilization module: the stability detection module is used for identifying the stability of the lane line and determining whether the stability of the lane line reaches a preset value according to video frame information in the video.
A position module: the system is used for identifying the position of the lane line and detecting the position information of the lane line according to the video frame information in the video.
Wherein, the stabilizing module includes:
solid line stabilization module: and if the type of the lane line is a solid line, setting the stability mark as definite when the far end of the lane line disappears on the horizon line and/or the lane line continuously exists when the vehicle moves and/or the length of the lane line reaches a preset value, and setting the stability of the lane line as uncertain under other conditions.
A dashed line stabilization module: configured to, if the type of the lane line is a dashed line, set the stability to stable when no obstacle is present in front of the lane line and/or the lane line disappears as the vehicle keeps moving, and otherwise set the stability of the lane line to unstable.
Wherein, the determining module 300 includes:
a first lane line module: the system comprises a video processing unit, a first road line and a second road line, wherein the video processing unit is used for determining the stability of the first road line and the position of the first road line according to video frame information in the video;
the current lane line module: the method is used for determining the category of the first lane line according to the category of the historical frame lane line to obtain the attribute parameter of the first lane line.
Wherein, current lane line module includes:
a correction module: and if the stability of the future frame lane line is stable and the stability of the first lane line is unstable, correcting the type of the first lane line and the stability of the first lane line according to the type of the future frame lane line and the stability of the future frame lane line to obtain the attribute parameters of the current frame lane line.
Referring to fig. 3, a flowchart schematically illustrating steps of an embodiment of the method for identifying a lane line according to the present application includes the following steps:
step S101, identifying the type of the lane line, and identifying the lane line as a solid line or a dotted line according to video frame information in the video;
step S102, identifying the stability of the lane line, and determining whether the stability of the lane line reaches a preset value according to video frame information in the video;
step S103, identifying the position of the lane line, and detecting the position information of the lane line according to the video frame information in the video.
Referring to fig. 4, a flowchart schematically illustrating steps of an embodiment of the method for identifying a lane line according to the present application includes the following steps:
step S301, determining the stability of the first lane line and the position of the first lane line according to video frame information in the video;
step S302, determining the category of the first lane line according to the category of the lane line of the historical frame, and obtaining the attribute parameter of the first lane line.
For the device embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, refer to the partial description of the method embodiment.
In the prior art, two techniques are used for detecting and identifying lane lines: one relies on the pixel classes output by a segmentation model, and the other treats continuous video frames as separate images; both methods have drawbacks.
By jointly inferring the current frame's lane line information from the temporal information of the preceding and following video frames together with the output of the segmentation model, the method and the device make the judgment of the lane line category more accurate and more robust. This matters especially for violation capture of solid-line lane changes: if a dashed line is mistakenly recognized as a solid line, a wrongful penalty results. When a dashed line is occluded by an obstacle, the segmentation model can easily identify it as a solid line, so combining the lane line information of the historical frame and the future frame is very important. In addition, with the temporal information of the preceding and following frames, the output of the lane line is more stable and abrupt changes of category are reduced.
The detection and identification of lane lines are indispensable in fields such as automatic driving and the auditing of intelligent traffic violation captures. Accurate detection of the lane line position can assist a vehicle in driving automatically within a legal lane, and intelligent violation capture places very high demands on accurate identification of the lane line type, for example for solid-line lane changes. The currently common lane line identification method obtains a mask of the lane line with a semantic segmentation model and determines the type of the lane line from the category of the pixels in the mask region. This causes two problems:
one is that the pixel classes output by the segmentation model are often not trusted, especially the distinction between solid and dashed lines. Intuitively, the solid and dashed lines are indistinguishable from a single pixel, and forcing the type of lane line to be determined by the class of pixel often produces unpredictable errors. In addition, the lane lines are also blocked by obstacles such as vehicles in real scenes, and the types of the lane lines cannot be accurately obtained only by the output of the segmentation model.
Secondly, the input to the lane line identification stage is usually a sequence of continuous video frames, and existing lane line identification methods treat these frames as separate images, ignoring the temporal information of the video; as a result, the category of the lane line may change abruptly.
Aiming at the above problems, a method is provided for jointly inferring the lane line category from the temporal information of the video together with the output of the segmentation model. When outputting the category of the current frame's lane line, the method relies not only on the segmentation output for the current frame but also on the type of the historical frame lane line and the type of the future frame lane line. Specifically, another embodiment of the present application is as follows:
Identify the attributes of the lane line to obtain attribute parameters, including the type of the lane line, i.e. whether it is a solid line or a dashed line; the stability of the lane line, i.e. whether it is in a stable or unstable state; and the position information of the lane line. For a solid line, the stability flag is set to stable if certain conditions are met, for example: the lane line is sufficiently long; the far end of the lane line disappears at the horizon; the lane line persists while the vehicle is in motion. For a dashed line, the state is likewise set to stable if certain conditions are met, for example: no obstacle is present in front of the lane line; the lane line disappears as the vehicle keeps moving. At this point, each lane line carries three pieces of information: category, stability, and position.
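The stability criteria listed above could be encoded as a single boolean check. Reading each "and/or" as a logical OR is an interpretation, and all flag names are illustrative assumptions:

```python
def is_stable(line_type: str,
              reaches_horizon: bool = False,
              persists_in_motion: bool = False,
              long_enough: bool = False,
              no_obstacle_ahead: bool = False,
              disappears_in_motion: bool = False) -> bool:
    """Return the stability flag for one lane line."""
    if line_type == "solid":
        # solid line: far end vanishes at the horizon, the line persists
        # while the vehicle moves, or its length reaches the preset value
        return reaches_horizon or persists_in_motion or long_enough
    if line_type == "dashed":
        # dashed line: nothing occludes it ahead, or it disappears
        # as the vehicle keeps moving
        return no_obstacle_ahead or disappears_in_motion
    return False
```

How the individual flags are computed (line length threshold, horizon detection, obstacle detection) is left open here, as the patent describes them only as conditions.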
Each video frame is input into the segmentation model to obtain the lane line information, while the attribute parameters of the lane lines in historical frames are stored. The lane line information of each subsequent frame then depends both on the output of the model and on the constraint imposed by the historical frames' lane line information. The prior knowledge behind this constraint is that the type of the lane line on a given lane does not change abruptly.
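One simple way to impose this no-sudden-change prior is a majority vote over a sliding window of recent per-frame categories; the window size and the voting rule here are illustrative assumptions rather than the disclosed mechanism:

```python
from collections import Counter
from typing import List

def constrained_category(model_category: str,
                         history: List[str],
                         window: int = 5) -> str:
    """Combine the segmentation model's category for the current frame with
    the categories recorded for recent historical frames, damping
    single-frame flips between "solid" and "dashed"."""
    recent = history[-window:] + [model_category]
    # majority vote over the sliding window; a lone outlier frame loses
    return Counter(recent).most_common(1)[0][0]
```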
A buffer queue is defined to store the lane line information of each frame. After the attribute parameters of the lane lines are obtained, instead of being output directly, they are stored in the buffer queue; when the buffer queue is full, the attribute parameters of the lane lines of the video frame at the head of the buffer queue are output.
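The buffer queue can be sketched as a fixed-capacity FIFO: each frame's attributes are enqueued rather than emitted immediately, and once the queue is full, each new push releases the attributes of the frame at the head. The capacity is an illustrative assumption:

```python
from collections import deque

class LaneBuffer:
    """Fixed-capacity FIFO of per-frame lane line attribute records."""

    def __init__(self, capacity: int = 5):
        self.capacity = capacity
        self.queue = deque()

    def push(self, frame_attrs):
        """Enqueue one frame's attributes. Once the queue is full, dequeue
        and return the head frame's attributes (the oldest frame, now
        constrained by every newer frame in the queue); otherwise None."""
        out = None
        if len(self.queue) == self.capacity:
            out = self.queue.popleft()
        self.queue.append(frame_attrs)
        return out
```

The delay introduced by the buffer is what gives "future" frames a chance to observe the line before the head frame's attributes are committed.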
While the attribute parameters of each frame's lane lines are being stored in the buffer queue, if the lane line of a future frame is found to be stable, the stability and category of every frame's lane lines already in the buffer queue are corrected accordingly. In this way, the output for the video is corrected using lane line information from future frames, ensuring the accuracy of the current frame's lane attribute parameters.
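The backward correction can be sketched as follows: when the newest frame's line is stable, every unstable entry still in the buffer inherits its category and stability flag. The record type and field names are illustrative assumptions:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class FrameLane:
    category: str   # "solid" or "dashed"
    stable: bool

def correct_buffer(buffer: List[FrameLane], future: FrameLane) -> None:
    """Back-propagate a stable future observation onto buffered frames that
    were left unstable (e.g. because the line was occluded)."""
    if not future.stable:
        return  # nothing trustworthy to propagate
    for past in buffer:
        if not past.stable:
            past.category = future.category
            past.stable = True
```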
Referring to fig. 5, in an embodiment of the present invention, a computer device is further provided. The computer device 12 takes the form of a general-purpose computing device, and its components may include, but are not limited to: one or more processors or processing units 16, a system memory 28, and a bus 18 that couples the various system components, including the system memory 28, to the processing unit 16.
Bus 18 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the Enhanced ISA (EISA) bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
Computer device 12 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by computer device 12 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 28 may include computer system readable media in the form of volatile memory, such as random access memory (RAM) 31 and/or cache memory 32. Computer device 12 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 34 may be used to read from and write to non-removable, nonvolatile magnetic media (commonly referred to as a "hard drive"). Although not shown in FIG. 5, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to bus 18 by one or more data media interfaces. The memory may include at least one program product having a set (e.g., at least one) of program modules 42, with the program modules 42 configured to carry out the functions of embodiments of the invention.
A program/utility 41 having a set (at least one) of program modules 42 may be stored, for example, in memory, such program modules 42 including, but not limited to, an operating system, one or more application programs, other program modules 42, and program data, each of which examples or some combination thereof may comprise an implementation of a network environment. Program modules 42 generally carry out the functions and/or methodologies of the described embodiments of the invention.
Computer device 12 may also communicate with one or more external devices 14 (e.g., a keyboard, a pointing device, a display 24, a camera, etc.), with one or more devices that enable a user to interact with computer device 12, and/or with any devices (e.g., a network card, a modem, etc.) that enable computer device 12 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 22. Also, computer device 12 may communicate with one or more networks (e.g., a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet) via network adapter 20. As shown, the network adapter 20 communicates with the other modules of the computer device 12 via the bus 18. It should be understood that, although not shown in the figures, other hardware and/or software modules may be used in conjunction with computer device 12, including but not limited to: microcode, device drivers, redundant processing units 16, external disk drive arrays, RAID systems, tape drives, data backup storage systems 34, and the like.
The processing unit 16 executes the programs stored in the system memory 28 to perform various functional applications and data processing, for example, implementing the method of identifying lane lines provided by an embodiment of the present invention.
That is, the processing unit 16 implements, when executing the program: acquiring a video, and determining attribute parameters of a lane line according to video frame information in the video, wherein the attribute parameters of the lane line comprise the type of the lane line, the stability of the lane line and the position of the lane line;
determining attribute parameters of the historical frame lane lines and attribute parameters of the future frame lane lines according to the attribute parameters of the lane lines;
and determining the attribute parameters of the current frame lane line according to the attribute parameters of the historical frame lane line and the attribute parameters of the future frame lane line.
In an embodiment of the present invention, the present invention further provides a computer-readable storage medium, on which a computer program is stored, which when executed by a processor, implements the method for identifying lane lines as provided in all embodiments of the present application.
That is, the program when executed by the processor implements: the method comprises the steps of obtaining a video, and determining attribute parameters of a lane line according to video frame information in the video, wherein the attribute parameters of the lane line comprise the type of the lane line, the stability of the lane line and the position of the lane line.
And determining the attribute parameters of the historical frame lane lines and the attribute parameters of the future frame lane lines according to the attribute parameters of the lane lines.
And determining the attribute parameters of the current frame lane line according to the attribute parameters of the historical frame lane line and the attribute parameters of the future frame lane line.
Any combination of one or more computer-readable media may be employed. The computer readable medium may be a computer-readable storage medium or a computer-readable signal medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electromagnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, or C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing terminal to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing terminal to cause a series of operational steps to be performed on the computer or other programmable terminal to produce a computer implemented process such that the instructions which execute on the computer or other programmable terminal provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, those skilled in the art may make additional variations and modifications to these embodiments once they become aware of the basic inventive concept. Therefore, it is intended that the appended claims be interpreted as including the preferred embodiments and all such alterations and modifications as fall within the scope of the embodiments of the invention.
Finally, it should also be noted that, herein, relational terms such as first and second may be used solely to distinguish one entity or action from another entity or action, without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements, but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or terminal that comprises the element.
The method, apparatus, device, and storage medium for identifying lane lines provided by the present invention have been introduced in detail above. Specific examples are applied herein to explain the principle and implementation of the present invention, and the description of the above embodiments is only intended to help in understanding the method and its core idea. Meanwhile, for a person skilled in the art, there may be variations in the specific embodiments and the scope of application according to the idea of the present invention. In summary, the content of this specification should not be construed as limiting the present invention.

Claims (10)

1. A method for identifying lane lines, the method being applied to the detection of lane lines during autonomous driving, comprising:
acquiring a video, and determining attribute parameters of a lane line according to video frame information in the video, wherein the attribute parameters of the lane line comprise the type of the lane line, the stability of the lane line and the position of the lane line;
determining attribute parameters of the historical frame lane lines and attribute parameters of the future frame lane lines according to the attribute parameters of the lane lines;
and determining the attribute parameters of the current frame lane line according to the attribute parameters of the historical frame lane line and the attribute parameters of the future frame lane line.
2. The method of claim 1, wherein the step of determining the attribute parameters of the lane lines according to the video frame information in the video comprises:
identifying the category of the lane line, and identifying the lane line as a solid line or a dotted line according to video frame information in the video;
identifying the stability of the lane line, and determining whether the stability of the lane line reaches a preset value according to video frame information in the video;
and identifying the position of the lane line, and detecting the position information of the lane line according to the video frame information in the video.
3. The method of claim 2, wherein the step of identifying the stability of the lane line and determining whether the stability of the lane line reaches a predetermined value according to video frame information in the video comprises:
if the type of the lane line is a solid line, setting the stability flag to stable when the far end of the lane line disappears at the horizon and/or the lane line persists while the vehicle is in motion and/or the length of the lane line reaches a preset value, and otherwise setting the stability of the lane line to unstable;
if the type of the lane line is a dotted line, setting the stability of the lane line to stable when no obstacle lies in front of the lane line and/or the lane line disappears as the vehicle continues to move, and otherwise setting the stability of the lane line to unstable.
4. The method of claim 2, wherein the step of determining the attribute parameters of the current frame lane line according to the attribute parameters of the historical frame lane line and the attribute parameters of the future frame lane line comprises:
determining the attribute parameters of a first lane line by using the attribute parameters of the historical frame lane line;
and correcting the attribute parameters of the first lane line according to the attributes of the future frame lane line to determine the attribute parameters of the current frame lane line.
5. The method of identifying a lane line according to claim 4, wherein the step of determining the attribute parameter of the first lane line using the attribute parameter of the lane line of the history frame comprises:
determining the stability of the first lane line and the position of the first lane line according to video frame information in the video;
determining the category of the first lane line according to the category of the historical frame lane line to obtain the attribute parameters of the first lane line.
6. The method of claim 4, wherein the step of determining the attribute parameter of the current frame lane line by modifying the attribute parameter of the first lane line according to the attribute of the future frame lane line comprises:
and if the stability of the future frame lane line is stable and the stability of the first lane line is unstable, correcting the type of the first lane line and the stability of the first lane line according to the type of the future frame lane line and the stability of the future frame lane line to obtain the attribute parameters of the current frame lane line.
7. The method of claim 6, wherein the current frame lane line attribute parameters comprise: the type of the current frame lane line, the stability of the current frame lane line and the position of the current frame lane line.
8. An apparatus for recognizing a lane line, comprising:
an attribute module, configured to acquire a video and determine attribute parameters of a lane line according to video frame information in the video, wherein the attribute parameters of the lane line comprise the type of the lane line, the stability of the lane line and the position of the lane line;
a segmentation module, configured to determine the attribute parameters of the historical frame lane lines and the attribute parameters of the future frame lane lines according to the attribute parameters of the lane lines;
a determination module, configured to determine the attribute parameters of the current frame lane line according to the attribute parameters of the historical frame lane line and the attribute parameters of the future frame lane line.
9. Electronic device, characterized in that it comprises a processor, a memory and a computer program stored on said memory and capable of running on said processor, said computer program, when executed by said processor, implementing the steps of the method of identifying lane lines according to any one of claims 1 to 7.
10. Computer-readable storage medium, characterized in that a computer program is stored on the computer-readable storage medium, which computer program, when being executed by a processor, carries out the steps of the method of identifying a lane line according to any one of claims 1 to 7.
CN202010789662.7A 2020-08-07 2020-08-07 Method, device, equipment and storage medium for identifying lane line Pending CN111814746A (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202010789662.7A CN111814746A (en) 2020-08-07 2020-08-07 Method, device, equipment and storage medium for identifying lane line
PCT/CN2020/123251 WO2021151321A1 (en) 2020-08-07 2020-10-23 Method and apparatus for identifying lane line, and device and storage medium
CN202011204772.9A CN112200142A (en) 2020-08-07 2020-11-02 Method, device, equipment and storage medium for identifying lane line

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010789662.7A CN111814746A (en) 2020-08-07 2020-08-07 Method, device, equipment and storage medium for identifying lane line

Publications (1)

Publication Number Publication Date
CN111814746A true CN111814746A (en) 2020-10-23

Family

ID=72863897

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202010789662.7A Pending CN111814746A (en) 2020-08-07 2020-08-07 Method, device, equipment and storage medium for identifying lane line
CN202011204772.9A Pending CN112200142A (en) 2020-08-07 2020-11-02 Method, device, equipment and storage medium for identifying lane line

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202011204772.9A Pending CN112200142A (en) 2020-08-07 2020-11-02 Method, device, equipment and storage medium for identifying lane line

Country Status (2)

Country Link
CN (2) CN111814746A (en)
WO (1) WO2021151321A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115797506A (en) * 2022-12-16 2023-03-14 江苏泽景汽车电子股份有限公司 Method and device for drawing lane line object, terminal equipment and storage medium

Families Citing this family (5)

Publication number Priority date Publication date Assignee Title
CN111814746A (en) * 2020-08-07 2020-10-23 平安科技(深圳)有限公司 Method, device, equipment and storage medium for identifying lane line
CN113602267B (en) * 2021-08-26 2023-01-31 东风汽车有限公司东风日产乘用车公司 Lane keeping control method, storage medium, and electronic apparatus
CN113780313A (en) * 2021-09-18 2021-12-10 东软睿驰汽车技术(沈阳)有限公司 Line feature extraction method and device and electronic equipment
CN114141009B (en) * 2021-10-31 2023-01-31 际络科技(上海)有限公司 Simulation traffic flow lane changing method and system based on multi-time sequence network
CN114644019B (en) * 2022-05-23 2022-08-02 苏州挚途科技有限公司 Method and device for determining lane center line and electronic equipment

Family Cites Families (12)

Publication number Priority date Publication date Assignee Title
US8311283B2 (en) * 2008-07-06 2012-11-13 Automotive Research&Testing Center Method for detecting lane departure and apparatus thereof
CN109670376B (en) * 2017-10-13 2021-05-25 神州优车股份有限公司 Lane line identification method and system
CN108470159B (en) * 2018-03-09 2019-12-20 腾讯科技(深圳)有限公司 Lane line data processing method and device, computer device and storage medium
CN108875603B (en) * 2018-05-31 2021-06-04 上海商汤智能科技有限公司 Intelligent driving control method and device based on lane line and electronic equipment
CN110160542B (en) * 2018-08-20 2022-12-20 腾讯科技(深圳)有限公司 Method and device for positioning lane line, storage medium and electronic device
CN109147368A (en) * 2018-08-22 2019-01-04 北京市商汤科技开发有限公司 Intelligent driving control method device and electronic equipment based on lane line
CN109409202B (en) * 2018-09-06 2022-06-24 惠州市德赛西威汽车电子股份有限公司 Robust lane line detection method based on dynamic interesting area
CN109409205B (en) * 2018-09-07 2021-11-26 东南大学 Aerial video highway lane line detection method based on line interval feature point clustering
CN110533925B (en) * 2019-09-04 2020-08-25 上海眼控科技股份有限公司 Vehicle illegal video processing method and device, computer equipment and storage medium
CN111160086B (en) * 2019-11-21 2023-10-13 芜湖迈驰智行科技有限公司 Lane line identification method, device, equipment and storage medium
CN111291681B (en) * 2020-02-07 2023-10-20 北京百度网讯科技有限公司 Method, device and equipment for detecting lane change information
CN111814746A (en) * 2020-08-07 2020-10-23 平安科技(深圳)有限公司 Method, device, equipment and storage medium for identifying lane line

Cited By (2)

Publication number Priority date Publication date Assignee Title
CN115797506A (en) * 2022-12-16 2023-03-14 江苏泽景汽车电子股份有限公司 Method and device for drawing lane line object, terminal equipment and storage medium
CN115797506B (en) * 2022-12-16 2023-11-17 江苏泽景汽车电子股份有限公司 Method, device, terminal equipment and storage medium for drawing lane line object

Also Published As

Publication number Publication date
WO2021151321A1 (en) 2021-08-05
CN112200142A (en) 2021-01-08

Similar Documents

Publication Publication Date Title
CN111814746A (en) Method, device, equipment and storage medium for identifying lane line
CN110163176B (en) Lane line change position identification method, device, equipment and medium
US11126876B2 (en) Method for recognizing traffic light, device, and vehicle
CN110781768A (en) Target object detection method and device, electronic device and medium
US10867393B2 (en) Video object detection
CN110533940B (en) Method, device and equipment for identifying abnormal traffic signal lamp in automatic driving
CN112380981A (en) Face key point detection method and device, storage medium and electronic equipment
CN109635861B (en) Data fusion method and device, electronic equipment and storage medium
US11361555B2 (en) Road environment monitoring device, road environment monitoring system, and road environment monitoring program
CN113971727A (en) Training method, device, equipment and medium of semantic segmentation model
CN109684944B (en) Obstacle detection method, obstacle detection device, computer device, and storage medium
CN112312001B (en) Image detection method, device, equipment and computer storage medium
CN110232368B (en) Lane line detection method, lane line detection device, electronic device, and storage medium
CN108734161B (en) Method, device and equipment for identifying prefix number area and storage medium
CN112507852A (en) Lane line identification method, device, equipment and storage medium
CN109635868B (en) Method and device for determining obstacle type, electronic device and storage medium
CN109215368B (en) Driving assistance method, device, equipment and computer storage medium
CN109270566B (en) Navigation method, navigation effect testing method, device, equipment and medium
CN109934185B (en) Data processing method and device, medium and computing equipment
CN115311634A (en) Lane line tracking method, medium and equipment based on template matching
WO2023066080A1 (en) Forward target determination method and apparatus, electronic device and storage medium
CN111488776A (en) Object detection method, object detection device and electronic equipment
CN113343986B (en) Subtitle time interval determining method and device, electronic equipment and readable storage medium
CN115249407B (en) Indicator light state identification method and device, electronic equipment, storage medium and product
CN114743174A (en) Determination method and device for observed lane line, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20201023